
Disney Research transforms movie-quality animations for interactive viewing

New method makes it easier to reuse film assets for games and virtual reality

Disney Research

Cinema-quality animation and virtual reality graphics that must be rendered in real time are often mutually exclusive categories, but Disney Research has developed a new process that transforms high-resolution animated content into a novel video format to enable immersive viewing.

The end-to-end solution the researchers devised will make it easier to repurpose animated film assets for use in video games and location-based VR venues. Viewers wearing head-mounted displays can interact with movie animations in a new way, based on the position and orientation of their heads.

"This new solution promises huge savings on the most costly aspects of interactive media production," said Professor Kenny Mitchell, senior research scientist.

The researchers will present their real-time rendering method May 16 at the Graphics Interface 2017 conference in Edmonton, Alberta.

"We've seen a resurgence in interest in virtual reality in recent years as companies have released a number of head-mounted displays for consumers," said Professor Markus Gross, vice president at Disney Research. "The subsequent demand for VR and other immersive content is driving innovations such as this ground-breaking set of methods for reusing rendered animated films."

Virtual reality scenes must be rendered in real time, and that performance requirement forces the use of animations far less complex than the highly detailed ones typical of feature films. As a result, when artists produce an interactive experience tied in to a film, or a related video game, they must convert the film animations into a lower-quality form compatible with real-time rendering or game engines. That process is both laborious and expensive.

Mitchell and his colleagues opt instead for an approach that relies on automated pre-processing: the 3D scenes are rendered from the perspective of a number of camera positions calculated to provide the best viewpoints for all of the surfaces in the scene with as few cameras as possible, and the results are encoded in a modular video format. This content can then be rendered in real time from an arbitrary point of view, allowing for motion parallax, head tilting and rotations.
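The camera-selection step described above, choosing the fewest viewpoints that together see every surface in the scene, can be framed as a set-cover problem. The sketch below is purely illustrative, not Disney's actual algorithm: it greedily picks camera positions until every surface sample point is covered, and it stands in for a real renderer's visibility test with a simple distance threshold.

```python
# Illustrative sketch (not the published method): greedy set-cover
# selection of camera viewpoints so every surface sample point is seen
# by at least one chosen camera, using as few cameras as possible.
# Visibility is approximated by a distance threshold; a production
# pipeline would use an actual renderer-based visibility test.

def select_cameras(candidates, surface_points, max_dist):
    """Greedily pick camera positions until all surface points are covered."""
    def visible(cam, pt):
        # Stand-in visibility test: a point is "seen" if it lies within
        # max_dist of the camera position.
        return sum((c - p) ** 2 for c, p in zip(cam, pt)) <= max_dist ** 2

    uncovered = set(range(len(surface_points)))
    chosen = []
    while uncovered:
        # Pick the candidate that covers the most still-uncovered points.
        best = max(candidates,
                   key=lambda cam: sum(visible(cam, surface_points[i])
                                       for i in uncovered))
        covered = {i for i in uncovered if visible(best, surface_points[i])}
        if not covered:
            break  # remaining points are not visible from any candidate
        chosen.append(best)
        uncovered -= covered
    return chosen

# Toy usage: three surface points along a line; with a generous range,
# the middle candidate alone covers everything.
cams = select_cameras(
    candidates=[(0, 0, 0), (5, 0, 0), (10, 0, 0)],
    surface_points=[(0, 0, 1), (5, 0, 1), (10, 0, 1)],
    max_dist=6.0,
)
```

Greedy set cover is a standard heuristic here because the exact minimum-camera problem is NP-hard; the greedy choice gives a logarithmic approximation guarantee while staying simple enough for automated pre-processing.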

"This process enables consumption of immersive pre-rendered video in six degrees of freedom using a head-mounted display," said Babis Koniaris, a post-doctoral associate on the team. "It can also be used for rendering film-quality visuals for video games."

In addition to Mitchell and Koniaris, the research team included Maggie Kosek and David Sinclair. The research was partly supported by the Innovate UK project #102684, titled OSCIR.

Combining creativity and innovation, this research continues Disney's rich legacy of leveraging technology to enhance the tools and systems of tomorrow.

###

For more information on the process, including a video showing example scenes, visit the project web site at http://www.disneyresearch.com/real-time-rendering-with-compressed-animated-light-fields/.

About Disney Research

Disney Research is a network of research laboratories supporting The Walt Disney Company. Its purpose is to pursue scientific and technological innovation to advance the company's broad media and entertainment efforts. Vice President Markus Gross manages Disney Research facilities in Los Angeles, Pittsburgh and Zürich, and works closely with the Pixar and ILM research groups in the San Francisco Bay Area.  Research topics include computer graphics, animation, video processing, computer vision, robotics, wireless & mobile computing, human-computer interaction, displays, behavioral economics, and machine learning.

Website: http://www.disneyresearch.com

Twitter: @DisneyResearch

Facebook: http://www.facebook.com/DisneyResearch
