Tuesday, December 23, 2008

Personalities as Problem Solving Aides

Last post I talked about personalities in computing, but I only briefly touched on why we might want personalities in computer programs or robots.

Aside from the obvious applications where a personality is necessary - robot pets, care bots for the disabled, sex bots, etc. - the idea of putting a personality into the equation doesn't seem like a good one.

The most obvious example is Clippy. The basic idea of Clippy was that Word was getting rather complicated, and MS wanted some way to have a human-feeling tutorial system. A "smart guide".

Of course, it was a travesty, and is a sterling example of why NOT to make your programs have a personality. After all, if the solution or path is clear, the user doesn't need someone popping up and getting in their way. And, usually, if the solution or path isn't clear, it's easier to do a keyword search than to clumsily interact with some primitive personality program.

But I think there are situations in which personable programs are required. A personable program may not have an overt, anthropomorphic personality, but there are times when even going that far might be useful.

First off, the basic idea of a personable program is that it understands the user better than current programs do. As time goes on and our need for context-sensitive data increases, our programs are going to have to get better at handling that. One example is a music bot that would pull music off the internet for you to listen to. Right now, the music bot software that's out there can be customized to your overall preferences, but it doesn't understand that sometimes you feel like listening to reggae, sometimes pop, sometimes filk. Humans have moods, and the current generation of software is only dimly aware of that fact.

A more advanced music bot would try to track your moods, trying to determine if, for example, you generally like funk in the morning and rap in the evening. A more physically present robot would probably try to determine your mood by scanning you - facial expression, body temperature, what clothes you decided to wear, whatever.
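
As a rough illustration of what that tracking might look like under the hood, here's a minimal sketch in Python. The hour-based "mood" buckets, the genre names, and the skip-counting heuristic are all placeholders invented for the example, not a description of any real music bot:

```python
from collections import defaultdict
from datetime import datetime
import random

class MoodyMusicBot:
    """Sketch of a bot that learns which genres work at which times of day."""

    def __init__(self):
        # play_counts["morning"]["funk"] -> how often funk went over well then
        self.play_counts = defaultdict(lambda: defaultdict(int))

    def _bucket(self, hour=None):
        hour = datetime.now().hour if hour is None else hour
        if hour < 12:
            return "morning"
        if hour < 18:
            return "afternoon"
        return "evening"

    def record(self, genre, skipped, hour=None):
        # Only songs you let play to the end count as evidence of your mood.
        if not skipped:
            self.play_counts[self._bucket(hour)][genre] += 1

    def pick_genre(self, hour=None):
        counts = self.play_counts[self._bucket(hour)]
        if not counts:
            return random.choice(["funk", "rap", "reggae", "filk"])
        # Genres that have worked in this time slot before are more likely.
        genres, weights = zip(*counts.items())
        return random.choices(genres, weights=weights, k=1)[0]
```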

At this point the software can't be said to have an anthropomorphic personality. There's no pair of glasses with a mouth popping up and saying "It looks like you're trying to listen to Aerosmith. Would you like some help?"

But, despite that, you can see that the program is developing some rudimentary personality. In trying to predict your personal preferences, it will show personality, at least as far as you are concerned. Today it played military marches all day. Why? Yesterday it seemed to like the B52s. How interesting. It's got a personality, although not a clear or deep one.

Now the issue is that you've got this thing with a personality, but your interactions with it are extremely limited, largely consisting of fast-forwarding through songs. So, when it gets some weird idea in its head, you don't really have any way to (A) figure out why it's doing that or (B) get it to stop doing that.

On the other hand, if we do have a pair of glasses with a mouth pop up, we can see why it thinks what it thinks. Obviously, a cutesy icon is not what I'd choose. I'd probably either go for a blinking cursor or a sexy librarian, but we'll stay neutral for the moment.

Say you're fast-forwarding through songs it really thought you would like. Instead of flailing around randomly trying to determine whether you're in a shitty mood or have a guest over or something, it can gently pop up and say "Hey, what's the deal? You in a bad mood or something?"

And you can type back - "entertaining guests" or something. The software will understand, or at least fake it, and try to find a mode of music that fits your interests better.
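
Here's a minimal sketch of that feedback loop, assuming the bot does nothing smarter than keyword-matching what you type against a few made-up context playlists (the contexts, genres, and matching rule are all invented for the example):

```python
import random

# Invented contexts and playlists; a real bot would learn these over time.
CONTEXT_PLAYLISTS = {
    "entertaining guests": ["lounge", "jazz", "pop"],
    "bad mood": ["ambient", "classical"],
    "working": ["instrumental", "electronic"],
}

class ContextAwareBot:
    def __init__(self, default_genres=("funk", "rap", "reggae", "filk")):
        self.default_genres = list(default_genres)
        self.context = None  # None means "fall back to the usual habits"

    def tell(self, message):
        # Crude keyword match standing in for "understand, or at least fake it".
        for name in CONTEXT_PLAYLISTS:
            if name in message.lower():
                self.context = name
                return f"Okay, switching to music for: {name}"
        return "Noted, but I'm not sure what you mean."

    def pick_genre(self):
        if self.context:
            return random.choice(CONTEXT_PLAYLISTS[self.context])
        return random.choice(self.default_genres)

bot = ContextAwareBot()
print(bot.tell("entertaining guests tonight"))  # switches into guest mode
print(bot.pick_genre())                         # picks from the guest playlist
```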

...

Now, at this stage, I've only covered about a quarter of the reason that software may require personalities.

The other half of the first half is that, as time goes on, this kind of contextual data feeding will become more and more critical and overwhelming. What sounds like a cute little semi-feature will become much more necessary when you're, say, trawling through Twitter, Flickr, YouTube, and whatever comes out next, trying to siphon out today's trends.

The second half happens on the back end - the side that doesn't face you. The programs require a personality because the data they'll be filtering - and how they filter it - is based on personalities.

If you don't have a personality, how can you form a meaningful opinion on whether a cute cat video is fun or not? On whether your friend (another piece of software) would like to hear this news from an earthquake-destroyed city? On this insider detail about a specific game developer...

As the volume of our media increases, we'll see a corresponding increase in the number of agents (programs) trawling through it, trying to make sense of it. Because the media is fundamentally based on humans, it's easiest to judge if you have some semblance of human in you. A personality.
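
To make that concrete, here's a minimal sketch of a filtering agent whose "personality" is nothing more than a profile of interest weights. The tags, weights, and forwarding threshold are invented for the example:

```python
# An agent's "personality" reduced to a profile of interest weights.
PERSONALITY = {
    "cute_cats": 0.9,
    "earthquake_news": 0.2,
    "game_dev_gossip": 0.7,
}

def score(item_tags, personality):
    # Average the agent's interest in whatever tags the item carries.
    if not item_tags:
        return 0.0
    return sum(personality.get(tag, 0.0) for tag in item_tags) / len(item_tags)

def worth_forwarding(item_tags, friend_personality, threshold=0.5):
    # Judge the item by the *friend's* tastes, not your own.
    return score(item_tags, friend_personality) >= threshold

# This agent loves cat videos, but forwards earthquake news only if the
# friend's profile says they care about it.
print(score(["cute_cats"], PERSONALITY))                               # 0.9
print(worth_forwarding(["earthquake_news"], {"earthquake_news": 0.8}))  # True
```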

This is especially true when it comes to creating and navigating the semantic net that will arise from all this data filtering...

So, in long, I think that we're going to see a rise in personality-filled programs because we're going to see a rise in the number of programs that have to interact with personalities.
