Monday, October 15, 2007

Social Gameplay: Gestures

If you've been reading this blog long, you know that I'm always trying to figure out a way to do social play. Especially with NPCs. The problem is twofold:

First, to do reasonable social play you need a very fine grain of interaction. You need to be able to express a wide range of emotions. Second, it has to flow smoothly. "Turn-based" social games have, so far, been so bad as to be worse than not having the social element at all.

So, you have to have a way of smoothly, rapidly expressing a wide range of emotions.

A bunch of ways have been proposed, such as time-pressured icon-clicking, or a 2D field where you move the mouse to express yourself along two dimensions. These kinds of methods are far from ideal. That's ivory tower speak for "they suck, but I'm trying not to be too offensive."

I was working on this parkour game, where you control your body with your left stick and your hands with the right stick. And I hit a little bump: talking to people. I didn't want to shunt you out to a completely different scheme ("dialog tree"), because that would ruin the immersion, which was the point of the game. So I decided that if you stood still near a person, your controls would enter a "gesture" mode instead of a "run around like a madman" mode. This would let you express yourself via gestures.
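Just to sketch what that mode switch might look like in code - every name and threshold here is invented, it's only the shape of the idea:

```python
from dataclasses import dataclass

GESTURE_RANGE = 2.0        # how close you have to be to an NPC to start gesturing (invented)
STILLNESS_THRESHOLD = 0.1  # left-stick deflection below this counts as "standing still" (invented)

@dataclass
class Player:
    x: float = 0.0
    y: float = 0.0
    mode: str = "parkour"

def near_anyone(player, npcs):
    """True if any NPC (given as an (x, y) pair) is within gesturing range."""
    return any(((player.x - nx) ** 2 + (player.y - ny) ** 2) ** 0.5 < GESTURE_RANGE
               for nx, ny in npcs)

def update_control_mode(player, left_stick, npcs, exit_pressed=False):
    """Flip between 'parkour' and 'gesture' control modes once per frame.

    How you *leave* gesture mode is an open question; a dedicated button press
    (exit_pressed) is purely my assumption here.
    """
    deflection = (left_stick[0] ** 2 + left_stick[1] ** 2) ** 0.5

    if player.mode == "parkour":
        if near_anyone(player, npcs) and deflection < STILLNESS_THRESHOLD:
            player.mode = "gesture"   # sticks now drive posture and hands
    elif exit_pressed or not near_anyone(player, npcs):
        player.mode = "parkour"       # back to running around like a madman

    return player.mode

# Standing still next to an NPC flips you into gesture mode.
p = Player(x=1.0, y=0.0)
print(update_control_mode(p, (0.0, 0.05), npcs=[(1.5, 0.0)]))  # -> gesture
```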

The more I thought about it, the more I grew to like it. At some point, the parkour element dropped out of my design entirely.

Think about it. Your left stick controls your body posture. Nudging it up and down would make you nod your head. Pushing it all the way forward would make you lean in conspiratorially. Pulling it back would make you lean away. You can lean forward off-kilter, wiggle back and forth, turn your back, etc.
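As a toy sketch of reading that left-stick mapping - the thresholds and posture labels are mine, not anything tuned:

```python
def classify_posture(left_stick):
    """Map an instantaneous left-stick position (x, y, each -1..1) to a body posture.

    The thresholds are invented; a real build would tune them by feel.
    """
    x, y = left_stick
    if y > 0.8:
        return "lean in"        # pushed all the way forward
    if y < -0.8:
        return "lean away"      # pulled all the way back
    if abs(y) > 0.15:
        return "nod"            # just a nudge up or down
    if abs(x) > 0.8:
        return "lean sideways"  # shoved off to one side
    return "neutral"

print(classify_posture((0.0, 0.2)))   # -> nod
print(classify_posture((0.0, 0.95)))  # -> lean in
```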

That's just the left stick. The right stick controls your hand(s). A little horizontal waggle might let you wag your finger negatively, whereas a big horizontal waggle would let you make a negatory cutting gesture. Circling across the top would make you do a big wave, whereas just gently pushing it partway up would make you do a little greeting hand-raise.
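The right stick needs a little history rather than a single reading, since the difference between a small waggle and a big one is the whole point. Again, every number and label below is a guess:

```python
def classify_hand_gesture(samples):
    """Map a short history of right-stick samples (a list of (x, y) pairs from
    the last few frames) to a hand gesture. The thresholds are invented."""
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    horizontal_swing = max(xs) - min(xs)  # how wide the waggle was
    peak_height = max(ys)                 # how far up the stick went

    if horizontal_swing > 1.2 and peak_height > 0.7:
        return "big wave"          # a sweep across the top of the stick's range
    if horizontal_swing > 1.2:
        return "cutting 'no'"      # a big horizontal waggle
    if horizontal_swing > 0.3:
        return "finger wag"        # a little horizontal waggle
    if 0.3 < peak_height <= 0.7:
        return "small hand-raise"  # gently pushed partway up
    return "hands at rest"

print(classify_hand_gesture([(-0.2, 0.0), (0.2, 0.0), (-0.2, 0.0)]))  # -> finger wag
print(classify_hand_gesture([(-0.9, 0.8), (0.0, 1.0), (0.9, 0.8)]))   # -> big wave
```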

Combine the two sticks. The difference between leaning forward with your arms thrown wide and leaning backwards with your arms thrown wide doesn't seem like much to our game-numbed senses, but it's significant. Leaning forward means "I'm gonna hug ya", while leaning backwards usually means either "you're gonna hug me" or "what'd I do?", depending on the hand position.
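Mechanically, combining the two readings could start as nothing fancier than a lookup table. The pairs below are just the examples from this post, plus an invented "arms wide" label for the hug case:

```python
# Combined meanings: (posture, hand gesture) -> what you're "saying".
COMBINED_GESTURES = {
    ("lean in", "arms wide"):      "I'm gonna hug ya",
    ("lean away", "arms wide"):    "you're gonna hug me / what'd I do?",
    ("lean in", "finger wag"):     "now listen here...",
    ("lean away", "cutting 'no'"): "back off",
}

def read_gesture(posture, hand_gesture):
    """Look up the combined meaning; unknown combos just read as their parts."""
    return COMBINED_GESTURES.get((posture, hand_gesture), f"{posture} + {hand_gesture}")

print(read_gesture("lean in", "arms wide"))    # -> I'm gonna hug ya
print(read_gesture("lean away", "arms wide"))  # -> you're gonna hug me / what'd I do?
```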

Obviously, the game would need to have some kind of "smart" system for handling the hands, and probably a system of clear feedback that would label your gesture for you, so you can learn the language. After all, it won't map perfectly to your imagination, because everyone's a bit different. Similarly, as your character gets more agitated, their hand gestures can change to more aggressive ones...
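The feedback label and the agitation tweak could hang off the same classification - again, purely a sketch with invented names and thresholds:

```python
# Agitated characters swap some gestures for harsher variants before display.
AGITATED_VARIANTS = {
    "finger wag": "jabbing point",
    "cutting 'no'": "dismissive shove",
    "big wave": "frantic arm-waving",
}

def gesture_with_mood(hand_gesture, agitation):
    """Escalate the gesture when agitation (0..1) is high; the cutoff is invented."""
    if agitation > 0.6:
        return AGITATED_VARIANTS.get(hand_gesture, hand_gesture)
    return hand_gesture

def feedback_label(posture, hand_gesture, agitation):
    """The on-screen caption that teaches the player what they just 'said'."""
    return f"[{posture} / {gesture_with_mood(hand_gesture, agitation)}]"

print(feedback_label("lean in", "finger wag", agitation=0.2))  # -> [lean in / finger wag]
print(feedback_label("lean in", "finger wag", agitation=0.8))  # -> [lean in / jabbing point]
```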

Anyhow. The only way I see to do a real-time high-bandwidth interaction like social communication is to teach the players a language built for it. And teaching the players to simply map a language they are already familiar with seems best. Sort of like the way that many people learn sign language: by mapping English (or whatever their language is) to gestures, even though sign language has a fair number of fundamental differences from English (or whatever).

This "language" (my gestural one, not sign language) is pretty limited, but that can't be avoided: I've only got two thumb sticks rather than a hundred thousand muscles that can all operate independently. So this system won't let you bring up the weather on its own - you may still need some method of choosing a topic - but once the topic is set, it will let you actually TALK about the weather, rather than selecting canned, limited responses to a scripted event.

Also entertaining would be the culture barrier: in a real-world setting, imagine the difference between gesturing to an American and gesturing to a Japanese guy. You're going to get completely different responses, because the gestures are going to be interpreted differently. (As the most obvious example, the Japanese palm-down beckoning gesture looks like "go away" to most Americans. Oh, we're all so silly.)
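Mechanically, the culture barrier is just one more interpretation layer: which table an NPC reads the gesture through depends on who they are. The entries below are made up purely to show the shape of it:

```python
# Each culture reads the same raw gesture differently; entries are illustrative only.
INTERPRETATIONS = {
    "culture_a": {"palm-down beckon": "come here", "nod": "yes"},
    "culture_b": {"palm-down beckon": "go away?",  "nod": "yes"},
}

def npc_reading(npc_culture, gesture):
    """How an NPC of the given culture interprets the player's gesture."""
    return INTERPRETATIONS.get(npc_culture, {}).get(gesture, "confused stare")

print(npc_reading("culture_a", "palm-down beckon"))  # -> come here
print(npc_reading("culture_b", "palm-down beckon"))  # -> go away?
```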

It could be fun... what do you think?

2 comments:

Patrick said...

Yeah I had an idea similar to this in '06 while in the EGW - lots of potential there.

Imagine what you could do with an AGI powered NPC interfacing through a WiiWare client.

Craig Perko said...

Naw, I don't think the Wiimote is suitable at all. A thumbstick is a constrained system with distinct "edges", like your body. The Wiimote is indistinct and blurry. Good for some things, but I wouldn't use it for this.