Friday, October 26, 2007

Constructing Artificial Emotion: Me Being an Ass

So, I've read Daniel Cook's (Danc's) Gamasutra article. I don't like it. I don't like it because he's wrong.

"Them's pretty strong words! What makes you so sure?"

Well, because he's wrong in the same way I was wrong when I came up with it. Do you think the eyes of the White Tower are blind?

The theory is far from unique. Years ago, I personally came up with it and tested it myself. I nibbled, chewed, refined, and eventually came up with the key theory, as linked above. It's re-discovered by every theorist who stumbles through. It is a deeply flawed theory, for all its attractive simplicity.

However, flawed theories are fine if they allow for useful implementation. This one DOES NOT.

He aggressively pursues the theory that if you put familiar emotional content in front of someone, they will feel a strong emotion. Unless that emotion is IRRITATION, that isn't so. There are dozens of other conditions required, ALL of which seem more useful to study than this theory.

For example: attention, wanting to be in the mood, not being in a different mood, being free of distractions, not having shifted the context, not having shifted your feelings on the matter...
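To make the gating concrete, here's a toy sketch in Python. Every name and condition below is my own invention, not anything from Danc's essay; the point is just that familiarity is one flag in a long chain of ANDs, and any one failure kills the emotion:

```python
from dataclasses import dataclass

# Toy sketch: all names are hypothetical. Familiarity is a single gate
# among many, and every gate has to pass before the emotion shows up.

@dataclass
class Viewer:
    paying_attention: bool
    wants_the_mood: bool
    in_conflicting_mood: bool
    distracted: bool
    context_shifted: bool
    feelings_changed: bool

def will_feel_emotion(viewer: Viewer, content_is_familiar: bool) -> bool:
    """The theory covers only the first term; the rest is the hard part."""
    return (
        content_is_familiar
        and viewer.paying_attention
        and viewer.wants_the_mood
        and not viewer.in_conflicting_mood
        and not viewer.distracted
        and not viewer.context_shifted
        and not viewer.feelings_changed
        # ...dozens more conditions, each arguably worth its own theory
    )

v = Viewer(paying_attention=True, wants_the_mood=True,
           in_conflicting_mood=False, distracted=True,
           context_shifted=False, feelings_changed=False)
print(will_feel_emotion(v, content_is_familiar=True))  # False: one distraction is enough
```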

Plus, most humans feel deeply about things they've never felt personally. I've never been really hungry. Every religious Jewish person has been hungrier than me. Yet pictures of starving people upset me greatly. I'm sure you also have many zany associations that are not explained by this theory.

There are a lot of theories that might explain it. Even now, you're probably thinking, "oh, but sympathetic simulation solves that..."

... or was that just me?

The fact of the matter is that simulating intelligence of any kind is extremely difficult. It's easy to think that you HAVE solved something, but it falls apart when you try to implement it. This is not a case of the industry not listening to academics. This is a case of the academics not having a clue.

A big part of the problem is simplicity. Even if we had actual human brains hooked into the games, they would not come across as convincing humans. At best, they would end up like WoW characters ("LFT? lvl60shaman?"). More likely, they would end up as a state machine that occasionally got into a snit.

This is made very clear if you are familiar with how humans in isolated, extreme conditions begin to act. Take smaller polar bases: within a few months of isolation, the crew's social behavior simplifies to a state machine you would be ashamed to put into a game. Plus, they tend to obsess over dumb details, such as Sunday's dinner. Lack of deep IO.
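To show how embarrassing I mean, here's roughly the level of complexity involved, as a minimal Python sketch (the states and triggers are invented by me for illustration):

```python
# A 'social life' reduced to a three-state machine: the kind of thing
# you'd be ashamed to ship in a game, yet isolated crews converge on
# behavior about this simple within months.

POLAR_BASE_FSM = {
    # state:      {trigger: next_state}
    "neutral":    {"slight": "snit", "sunday_dinner": "obsessing"},
    "snit":       {"apology": "neutral", "slight": "snit"},
    "obsessing":  {"dinner_served": "neutral"},
}

def step(state: str, trigger: str) -> str:
    """Advance the social state machine one event at a time."""
    return POLAR_BASE_FSM[state].get(trigger, state)

state = "neutral"
for event in ["slight", "apology", "sunday_dinner", "dinner_served"]:
    state = step(state, event)
    print(event, "->", state)
```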

Even the most complex simulated world is so heavy-handed and low-grain that a polar base looks downright lush. Hell, a broom closet looks downright lush.

Increasing the dexterity with which an NPC can interact with the world doesn't help, because there's no significant world to interact with.

To me, it's obvious that we won't get very far by trying to simulate from the bottom up. Even trying to get a complex enough world is computationally impossible.

We have to cheat.

Danc is aware of this on some level, and is cheating by storing up elements that have been proven to cause emotions in some way for some audience subset. The idea is that once you have a stock of these elements, you can use them to do something meaningful. Something that causes emotion.
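In code terms, I read his approach as something like this stock-and-lookup library. To be clear, this is my guess at the shape of it; none of these names come from his essay:

```python
from collections import defaultdict

# Hypothetical sketch of a 'stock of proven elements': content tagged
# with the emotion it reliably evoked and the audience subset it worked
# on, queried later when you need a particular effect.

library = defaultdict(list)

def record(element, emotion, audience):
    """File an element away once playtesting shows it works."""
    library[(emotion, audience)].append(element)

def pick(emotion, audience):
    """Pull everything known to evoke this emotion for this audience."""
    return library[(emotion, audience)]

record("abandoned teddy bear", "grief", "parents")
record("ticking clock", "tension", "everyone")
print(pick("grief", "parents"))  # ['abandoned teddy bear']
```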

But the approach is much too heavy-handed. I don't want to see my face in a game, and I certainly don't want to see my life. I'm not even sure such a game would be psychologically safe: if done right, it would totally fuck you up.

Besides, the best stories are told by someone with a vision. They aren't remixing collected symbols, at least not on such a simplistic level. The content in their stories is carefully crafted to cause emotion, and they use content we've never experienced. I've never been on a quest to destroy a ring, or flown through a nebula, or gotten mixed up with the mafia. But these things cause a predictable emotion in me and millions of other people.

As ever, the point of a story is to say something.

I would much rather work on a system which helps game designers with THAT.

...

music is much better at causing emotion, anyway

2 comments:

Daniel Cook said...

Nice! That is more of what I was hoping for.

First, where did I talk about simulating intelligence? That sounds like my essay seen through Perko-colored glasses. :-)

Second, I did find it interesting that as the thought exercise unfolded, single-player games never entered the equation. I doubt they are all that great at evoking emotions unless you end up relying on narrative tricks.

Honestly, I was leaning more towards games like The Bachelor...where copious tears come about through putting humans together in situations that encourage drama.

Thirdly, narrative, that light and empty syrup slathered on so many unfortunate games, was discussed in the essay out of necessity. It is hard to write about emotion in games without putting the predominant method of evoking emotions into context.

Still, I don't find your counter arguments very convincing. Have you been hungry? Then you can sympathize. Perhaps someone who had a personal experience with starvation would exhibit a more extreme response. Yes, narrative is complicated. Yes, we have basic error checking in place so that we don't start hallucinating about oranges when we see an orange tunic. But the basic concept that we bring our experience to the art at hand seems like such a stunningly obvious comment that it isn't worth refuting.

And lastly, I agree about the psychologically damaging part. Once you start generating real emotional experiences, there is the danger that they will overwhelm the player. Here's a thought experiment.
- Ask a young child if he wants to play a game called "Getting picked first". He may say yes.
- Take the child to a far off summer camp. Wake him up early so that he doesn't get enough sleep.
- Don't feed him breakfast.
- Tell him that the person who gets picked last in kickball is a failure. Have several people mention this.
- Tell him that being part of the kickball team means you'll have friends.
- Secretly tell the team captains to pick him last. For impact, have them sneer when they look at him.
- Observe.

Is that a morally acceptable game? I doubt it. Quite likely you'll end up with at least one or two screwed up children in a population of a thousand people who play this little game.

Games that focus on emotional experiences wield real power. And that can be damaging.

Appreciate that you took the time to post on the essay,
Danc.

Craig Perko said...

I don't have any other color glass to see it through...

Besides, you're really trying to simulate a writer/director.

If you're going to have multiplayer games that evoke emotion, the important part is the emotion between the players, not storing up images for later. People have a strong response to other people.

LARPs are a great example of this. If you've ever played or watched a good LARP, you can watch everyone's emotions ricochet off one another. The players go out of their way to enter a given mindset (feel a given emotion) despite the fact that, in normal life, they would feel nothing of the sort even when faced with the same situations.

Sympathy is a dramatically more important element, ESPECIALLY for multiplayer games.

I used to have a whole section of counter arguments, which I deleted because it went on for a page. Here are a few:

I don't feel sorry for hungry children because they're hungry. They don't look hungry. They look dead.

I get emotional about space launches. How is that covered?

Music changes my mood, and it has nothing to do with what I was feeling when I first heard similar music. That's also not covered.

I feel differently about the same situations now than ten years ago. For example, a lot of associations have turned to nostalgia.

Also, someone's existing mood produces enough interference that the emotion you try to introduce will be down to chance anyway.

All of these things CAN be dealt with by expanding the theory. I expanded it in my own way, as I linked. But the core theory is at best incomplete, and more likely not a useful algorithm.