How do people orient themselves when they are in a new area? How do we use street signs or houses, for instance, to estimate the distance we have traveled? Put simply: how do we update our mental map? Neuroscientists have been studying such questions in animals to learn about the basic principles of spatial cognition. "Until now, we have envied an invention from the world of science fiction: a holodeck like they have in Star Trek," says Prof. Dr. Andrew Straw. The holodeck is a space that can simulate any desired virtual world. "Something like the holodeck from Star Trek would enable key experiments in which we could artificially decouple an animal's movement from its perception," says the Freiburg professor of biology. Together with his colleague Prof. Dr. Kristin Tessmar-Raible from the Max F. Perutz Laboratories, a joint venture of the University of Vienna and the Medical University of Vienna, Austria, and an international team, Straw has constructed a kind of holodeck and with it created new opportunities for researching spatial cognition. The animals perceived the simulated objects as real and changed their behavior in different visual environments. The research team describes its results in the journal Nature Methods.
Animals and humans use every available sensation to update their mental map. However, every movement is inextricably linked to sensation, so the researchers had to decouple the two processes in order to understand how the brain handles each kind of information. The group built a flexible system for mice, flies, and fish, three species commonly used in neurobiology and behavior research. "We created an immersive, 3-D virtual reality in which the animals could move freely," explains Straw, "because we wanted our visual scenery to tie in naturally with the animal's own action-perception cycle." The visual landscapes included, for example, vertical pillars, multiple plants, and a swarm of video-game space invaders. Multiple high-speed cameras tracked and recorded the precise 3-D position of the animal, and a computer program registered each of its movements within milliseconds, allowing a continuously updated image to be projected onto the walls.
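The core of such a system is the closed loop between tracking and rendering: each freshly measured 3-D position of the animal is used to recompute what the scene should look like from that vantage point before the next frame is projected. The sketch below illustrates the idea in miniature, reducing "rendering" to computing the azimuth at which each virtual object should appear; all names (`Position`, `render_viewpoint`, `closed_loop_step`) are hypothetical illustrations, not the authors' actual software.

```python
import math
from dataclasses import dataclass

@dataclass
class Position:
    """A tracked 3-D position, e.g. from the high-speed camera system."""
    x: float
    y: float
    z: float

def render_viewpoint(animal_pos: Position, obj_pos: Position) -> float:
    """Azimuth (radians) at which a virtual object should be drawn for
    an animal at animal_pos. A hypothetical stand-in for a full
    perspective-correct renderer."""
    return math.atan2(obj_pos.y - animal_pos.y,
                      obj_pos.x - animal_pos.x)

def closed_loop_step(tracked_pos: Position, scene: list) -> list:
    """One iteration of the track-then-render cycle: the latest tracked
    position drives the projected image, keeping the display locked to
    the animal's own movement."""
    return [render_viewpoint(tracked_pos, obj) for obj in scene]
```

In the real system this loop must complete within milliseconds, so that the animal's self-motion and the visual update stay perceptually coupled.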
The researchers designed experiments that used visual stimuli to change and control flies' direction of flight, to test whether mice are afraid of virtual heights, and to make fish move virtually between two different worlds, changing their behavior depending on the visual environment. In another experiment, they simulated a swarm of space invaders for a real fish to swim in. The computer-controlled swarm was programmed to react to the fish as one of its own, and the fish's behavior was clearly influenced by the swarm. In addition, the team was able to address a fundamental problem in collective-behavior research: historically, it has been impossible to directly manipulate the interactions between multiple individuals. Together with the group of Prof. Dr. Iain Couzin of the University of Konstanz and the Max Planck Institute for Ornithology, the researchers developed a photo-realistic, computer-controlled model of a swimming fish and showed that real fish follow the virtual fish most reliably when the virtual fish matches its swim direction to the real fish's.
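A swarm that "reacts to the fish as one of its own" can be sketched with a simple attraction rule in which every virtual swarm member steers a small step toward the real fish's tracked position each frame. This is a loose, hypothetical illustration of a responsive swarm, not the actual model used in the study; the function name and the `gain` parameter are assumptions.

```python
def swarm_update(virtual_positions, real_fish, gain=0.1):
    """Move each virtual swarm member a fraction `gain` of the way
    toward the real fish's (x, y) position, so the computer-controlled
    swarm responds to the animal's own movement."""
    fx, fy = real_fish
    return [
        (vx + gain * (fx - vx), vy + gain * (fy - vy))
        for vx, vy in virtual_positions
    ]
```

Because the update runs inside the same tracking loop that drives the projection, the interaction becomes fully controllable: the experimenter can set exactly how, and how strongly, the virtual individuals respond to the real one.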
In addition to the laboratories mentioned above, the work also involved the lab of Dr. Wulf Haubensak from the Institute of Molecular Pathology Vienna, and the lab of Prof. Dr. Karin Nowikovsky from the Medical University of Vienna.
John R. Stowers, Maximilian Hofbauer, Renaud Bastien, Johannes Griessner, Peter Higgins, Sarfarazhussain Farooqui, Ruth M. Fischer, Karin Nowikovsky, Wulf Haubensak, Iain D. Couzin, Kristin Tessmar-Raible, Andrew D. Straw: Virtual reality for freely moving animals. In: Nature Methods. DOI: 10.1038/nmeth.4399