News Release

Virtual twin in 10 minutes

CITEC Cluster of Excellence Project accelerates avatar generation

Business Announcement

Bielefeld University

Image: In order to test the new process, doctoral student Jascha Achenbach had 40 cameras photograph him simultaneously (left). Ten minutes later, the virtual version of the researcher was finished (right).

Credit: CITEC/Bielefeld University

Avatars - virtual persons - are a core element of ICSpace, the virtual fitness and movement environment at Bielefeld University's Cluster of Excellence Cognitive Interaction Technology (CITEC). The system makes it possible to practise and improve motion sequences by providing individualised feedback in real time; this feedback is embodied by a virtual person acting as a coach. In addition, users see themselves as avatars - virtual copies of themselves - in the mirror of the virtual room. The creation of such personalised avatars used to take several days, but CITEC researchers have now developed an accelerated process.

In order to create avatars for the ICSpace system, the researchers "scan" people. The computer scientists use a circular array of 40 DSLR cameras to photograph the respective person from all sides and use these images to compute several million three-dimensional sample points on the person's body. A generic virtual human model is fitted to this data in such a way that it corresponds to the shape and appearance of the person scanned. "Our virtual human model was generated from more than one hundred 3D scans and contains statistical knowledge about human body shape and movement," says Professor Dr. Mario Botsch, head of the Computer Graphics and Geometry Processing research group and one of the coordinators of the ICSpace project. "Only through this model are we able to create avatars quickly and automatically."
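To make the idea of fitting a statistical human model to scan data more concrete, here is a minimal, purely illustrative sketch. It fits the coefficients of a toy PCA shape model to noisy 3D sample points by least squares. All data, sizes and variable names are assumptions for the example; the actual CITEC model and fitting procedure are not described in detail in this release, and real pipelines solve a harder non-rigid registration problem without known correspondences.

```python
# Minimal sketch: fitting a PCA-based body-shape model to scanned 3D points.
# Everything here (sizes, names, synthetic data) is illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Toy "statistical human model": a mean shape plus a small PCA basis,
# hypothetically learned from many 3D scans. Real models are much larger.
n_vertices, n_components = 500, 10
mean_shape = rng.normal(size=(n_vertices, 3))               # template vertices
pca_basis = rng.normal(size=(n_components, n_vertices, 3))  # shape directions

def reconstruct(coeffs):
    """Shape = mean shape + linear combination of PCA components."""
    return mean_shape + np.tensordot(coeffs, pca_basis, axes=1)

# Simulated "scan": noisy sample points of an unknown person, assumed here
# to be in correspondence with the template vertices for simplicity.
true_coeffs = rng.normal(size=n_components)
scan_points = reconstruct(true_coeffs) + 0.01 * rng.normal(size=(n_vertices, 3))

# Least-squares fit of the shape coefficients c:
# minimise || mean_shape + B @ c - scan_points ||^2
B = pca_basis.reshape(n_components, -1).T        # (3 * n_vertices, n_components)
residual = (scan_points - mean_shape).reshape(-1)
coeffs, *_ = np.linalg.lstsq(B, residual, rcond=None)

fitted = reconstruct(coeffs)
print("mean fitting error:", np.linalg.norm(fitted - scan_points, axis=1).mean())
```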

The resulting virtual people can be animated in detail: they can move all joints, even individual fingers, and communicate through facial expressions, speech and gestures. "The most important feature, though, is that they reflect the user photorealistically," says Botsch. This is crucial because personalised avatars are much more readily accepted by users, as shown in a study carried out by computer graphics researchers from Bielefeld in cooperation with Professor Dr. Marc Latoschik from the University of Würzburg. "The study shows that users identify better with such a custom-tailored individualised avatar than with an avatar that does not resemble them, even if it looks similarly realistic," says Latoschik, who holds the chair for Human-Computer Interaction in Würzburg.

"Until a few months ago, the individual processing steps for creating avatars were scarcely automated," says Botsch. The new process has changed this.

For the current study, his team has developed algorithms that accelerate the complete processing pipeline from the photo data to the animatable avatar. "This way, we can now generate the avatar of any person within ten minutes," says Jascha Achenbach, lead author of the resulting publication. "We create the virtual avatars in a format that is also used by the computer games industry," says Thomas Waltemate, who, like Achenbach, works in Botsch's research group. This also makes the avatar generation interesting for commercial use.
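The paragraph above describes an automated pipeline from photos to an animatable, exportable avatar. The following sketch only illustrates that idea as a sequence of stages; every stage name, function body and the output file format (FBX is used here only as an example of a game-industry format) is a placeholder assumption, not the actual CITEC implementation.

```python
# Illustrative sketch of an automated avatar-generation pipeline.
# Stage names, functions and the export format are assumptions for the example.
import time
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Stage:
    name: str
    run: Callable[[Dict], Dict]  # each stage takes and returns the pipeline state

def capture_photos(state):            # 40 DSLR images (here just a count)
    state["photos"] = 40
    return state

def reconstruct_point_cloud(state):   # millions of 3D samples in the real system
    state["points"] = "point cloud"
    return state

def fit_statistical_model(state):     # template fitting, as sketched earlier
    state["mesh"] = "personalised mesh"
    return state

def rig_and_texture(state):           # skeleton, facial animation, textures
    state["avatar"] = "animatable avatar"
    return state

def export_for_game_engine(state):    # hypothetical export, e.g. to FBX
    state["file"] = "avatar.fbx"
    return state

PIPELINE: List[Stage] = [
    Stage("capture", capture_photos),
    Stage("reconstruction", reconstruct_point_cloud),
    Stage("model fitting", fit_statistical_model),
    Stage("rigging/texturing", rig_and_texture),
    Stage("export", export_for_game_engine),
]

def generate_avatar() -> Dict:
    state: Dict = {}
    for stage in PIPELINE:
        t0 = time.perf_counter()
        state = stage.run(state)
        print(f"{stage.name}: {time.perf_counter() - t0:.3f}s")
    return state

if __name__ == "__main__":
    print(generate_avatar())
```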

The researchers presented their development of accelerated avatar generation a month ago in Gothenburg (Sweden) at the renowned conference "ACM Symposium on Virtual Reality Software and Technology". The study on how personalised avatars are accepted by users will be presented at the IEEE Conference on Virtual Reality and 3D User Interfaces, the world's leading conference on virtual reality, in the spring of 2018.

The virtual training environment ICSpace is a joint development by six research groups of the Excellence Cluster CITEC. ICSpace stands for "Intelligent Coaching Space". The system analyses the movements of athletes and rehabilitation patients and helps to correct them. It is based on an open space with two projection walls (front and floor), known as a Cave Automatic Virtual Environment (CAVE). The CAVE makes it possible to simulate a walk-in, computer-generated virtual environment. Test subjects wear 3D glasses similar to those worn in the cinema. It is the first system of its kind worldwide to simulate the complete training process and adapt flexibly to the user's behaviour. Mario Botsch coordinates the project together with computer scientist Professor Dr. Stefan Kopp and sport and cognitive scientist Professor Dr. Thomas Schack.

ICSpace is one of four large-scale projects at CITEC. The Excellence Cluster CITEC is providing the project with a total of 1.6 million euros in funding until the end of 2017. The other projects are the robot service apartment, the walking robot Hector and the self-learning grasp system "Famula". CITEC is funded by the German Research Foundation (DFG) on behalf of the German federal and state governments (EXC 277) as part of the Excellence Initiative until the end of 2018. In the new Excellence Strategy of the federal and state governments, Bielefeld University is applying for a cluster based on the research of the current Excellence Cluster CITEC.

###

Original publication:

Jascha Achenbach, Thomas Waltemate, Marc Latoschik, Mario Botsch: Fast Generation of Realistic Virtual Humans. Proceedings of the ACM Symposium on Virtual Reality Software and Technology, published in November 2017. http://bit.ly/2C5XP5D
