Image: Somatosensory cortex activity in the human brain
Credit: Netherlands Institute for Neuroscience
In collaboration with universities across the world, Nicholas Hedger (University of Reading) and Tomas Knapen (Netherlands Institute for Neuroscience & Vrije Universiteit Amsterdam) explored the depths of human experience. They discovered how the brain translates the visual world around us into touch, creating a physically embodied world for us to experience. “This aspect of human experience is a fantastic area for AI development.”
Imagine you’re cooking dinner with a friend when suddenly they cut themselves. Within milliseconds, you have pulled a face, winced, and perhaps even flinched your own hand away. This bodily reaction is the result of your brain activating its touch centre, the somatosensory cortex.
How is it possible that our sense of touch is activated purely by looking at another person? To answer this question, researchers from the UK, the USA, and the Vrije Universiteit Amsterdam and Netherlands Institute for Neuroscience (NIN-KNAW) in Amsterdam explored the phenomenon using an unusual approach: watching Hollywood films.
Tomas Knapen (last author) and his colleague Nicholas Hedger (first author) analysed a dataset in which participants lay in a brain scanner while watching clips from Hollywood films such as The Social Network and Inception. Their goal was to use these data to identify the underlying brain structures that help us truly experience what we see.
Mapping our experiences
When scientists talk about “maps” in the brain, they mean the way the brain organises information about the body and the world around us. In the somatosensory cortex, for example, the entire body is mapped onto the cortical surface: one end processes touch on the feet, while the other end processes touch on the head. These maps help the brain understand where a sensation comes from. Discovering similar maps in the visual cortex is exciting because it suggests that the brain links what we see directly to what we feel.
“We found not one, or two, but eight remarkably similar maps in the visual cortex!” Knapen explains. “Finding so many shows how strongly the visual brain speaks the language of touch.”
These maps mirror the body’s organisation in the somatosensory cortex from head to toe, suggesting that when we see someone, the brain organises that visual information in the same bodily way it does when we feel something.
Different map, different purpose
So, what do these maps actually do, and why are there so many of them? It turns out that different maps serve different purposes. Some focus heavily on identifying body parts, while others focus on placing them in space. “I think that there are many more purposes, but we just haven’t been able to test them yet”, Knapen adds.
Each map can help you extract specific information depending on your goal at the time. “Say you stand up and grab a cup of coffee. If I’m interested in what you’re doing, I will probably focus on your hand grabbing the cup. Now imagine that I’m more interested in your emotional state. In that case, I might focus more on your overall posture or your facial expressions. Every time you look at a person, there are many different bodily translations that need to be conducted visually. We think that these maps are a fundamental ingredient in that exact process”.
While having all these different overlapping maps may seem inefficient, Knapen believes the opposite. “This allows the brain to have many types of information in a single space, and make a translation in any way that is relevant in that moment”, he explains.
Further research and the future of neurotechnology
These findings open opportunities for many follow-up studies. The involvement of these body maps in emotional processing forms a foundation for further investigation in social psychology, but it could also prove useful in a more clinical setting. “People with autism can struggle with this sort of processing. Having this information could help us better identify effective treatments”, Knapen explains.
In the long run, these maps could even aid the development of neurotechnology. “Training sets for brain implants often start off with instructions like ‘try to think of a movement’. If these bodily processes can be activated in much broader ways, then there might be much broader possibilities to train and develop those brain-computer interfaces”.
Knapen believes his findings could drive future AI advancements. “Our bodies are deeply intertwined with our experiences and understanding of the world. Current AI primarily relies on text and video, lacking this bodily dimension. This aspect of human experience is a fantastic area for AI development. Our work shows the potential for very large, precision brain imaging datasets to fuel this development: a beautiful synergy between neuroscience and AI.”
But while looking to the future, Knapen still seems taken with the implications of his current findings. “I just want to understand the depths of the human experience, and it really feels like we just found this central ingredient for it”.
Source: Nature