On the screen, computer-generated game characters shrug, wink, nod, wave, or cross their arms with skeptical hostility as they follow your every move with an attentive gaze. A University of Southern California-developed system module called "Social Puppet" is pulling the strings.
Once a given character is designed, a set of standard commands orchestrates a whole range of non-verbal expressions of mood and attention. The same commands work for any other character in the game.
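The article does not show what these commands look like, but the core idea, one abstract command set driving the nonverbal behavior of any character, can be sketched roughly as follows. All class and command names here are illustrative guesses, not the actual Social Puppet interface.

```python
# Hypothetical sketch of the Social Puppet idea: a single set of
# standard nonverbal commands ("nod", "shrug", "gaze_at", ...) that
# works for any character once that character has been designed.
# Names are illustrative only; the article does not describe the real API.

class Character:
    """A game character whose animations are driven by standard commands."""

    def __init__(self, name):
        self.name = name
        self.log = []  # record of commands the engine has dispatched

    def perform(self, command, **params):
        # The engine would map the abstract command onto this particular
        # character's animation rig; here we just record the request.
        self.log.append((command, params))
        return f"{self.name} performs {command}"

# The same commands orchestrate behavior for any character in the game.
soldier = Character("soldier")
merchant = Character("merchant")
for character in (soldier, merchant):
    character.perform("nod")
    character.perform("gaze_at", target="player")
```

The point of the design is reuse: the behavior repertoire is written once, against the abstract command set, rather than re-animated for each new character.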
"Human communication is only partly verbal," says Hannes Högni Vilhjálmsson of the USC Information Sciences Institute, who designed Social Puppet.
He calls the software an "engine" to generate visual social behavior, and will present it at the AAAS Annual Meeting in St. Louis on Friday, February 20 at 2 p.m.
Vilhjálmsson is one of the builders of a set of ISI-created videogames called "Tactical Language and Culture" that the armed forces now use to teach soldiers language and customs quickly. Hundreds of soldiers have trained with "Tactical Iraqi," while a "Tactical Pashto" is being readied for Afghanistan.
In the games, learners control figures representing themselves, who interact with other characters animated by artificial intelligence, a specialty of ISI, which is part of the USC Viterbi School of Engineering.
"To introduce players to a culture that is unfamiliar to them," notes Vilhjálmsson in his presentation, "it is important to have them both observe nonverbal behavior that reflects the culture, and have them be able to perform - appropriate behaviors in return."
Additionally, "when having a conversation face-to-face, people rely on - spontaneous non-verbal cues such as gesture, gaze and head movement. This is even more critical when trying to have a conversation in an unfamiliar language."
Finally, Vilhjálmsson says, students need feedback from the game to know how well they are communicating. "Puzzled or offended expressions are much more intuitive clues than, say, a printed message saying 'your words not understood.'"
The director of the Tactical Language and Culture Project is ISI's Lewis Johnson. "People naturally tend to rely heavily on nonverbal communication when they are learning a foreign language, to make up for their lack of verbal knowledge," he says.
"But nonverbal gestures can sometimes be a source of confusion, since people in different cultures tend to employ different gestures. So we believe that it is important to include nonverbal communication in Tactical Language and Culture, both to promote verbal learning and to train people to communicate effectively face to face with people in other cultures."
Vilhjálmsson says the idea for the Social Puppet "engine" comes from the physics engines now used in many videogames, in which the laws of physical behavior are programmed into game objects, so that, for example, taking a turn too fast in a racing game sends the car into a spinout.
"The sciences that study human social behavior have discovered a lot of regularities that we can program into virtual characters as baseline behavior. It has always been a dream of mine to take that literature and make it really accessible and useful to the game programmer."
He gives examples of the kind of standard gestures he means: "animating gaze as a function of turn-taking, or animating a reaction to another's attempt to make eye contact based on attitude and availability."
"These are things we take for granted in the real world, but have to be programmed into virtual characters," he continues. "So why do that from scratch every time when we can exploit and implement models that exist in the literature?"
"As computer games become more socially interactive," he concludes, "there will be an increasing demand for 'plug-in' AI engines that orchestrate believable social performances. "
Vilhjálmsson's work on Social Puppet was financed by DARPA and carried out at ISI's Center for Advanced Research in Technology for Education (CARTE).