News Release

Do you need to see to gesture? New research reveals how blind people express concepts without vision

Hands in Motion, Eyes Closed

Peer-Reviewed Publication

Max Planck Institute for Psycholinguistics

A team of researchers at the Max Planck Institute for Psycholinguistics set out to investigate whether people who are blind gesture like sighted people when talking about the world, and how their unique perceptual experience might influence the way they express different kinds of concepts.
 

The study: Hands in Motion, Eyes Closed

In this first-of-its-kind study, blind and sighted participants were asked to perform two tasks. First, they heard a word - like “pen” or “bridge” - and had six seconds to represent it using only gestures. Second, they listed features of the same words in writing. The tasks covered 60 concepts, including manipulable objects (e.g., pen, spoon), non-manipulable objects (e.g., bridge), and animals (e.g., lion).

By combining gesture and language tasks, the researchers aimed to understand whether language alone is enough to shape gesture use, or whether visual experience plays a key role.


Key findings: experience shapes expression

The study uncovered several compelling insights:

  • Blind participants gestured less frequently than sighted people when representing objects they couldn’t physically handle or directly experience - like bridges or lions.
  • For manipulable objects, however, blind and sighted participants produced similar gestures, often mimicking how the object is used - like writing for “pen.”
  • When it came to listing features in writing, both groups showed strikingly similar knowledge across all categories.

These findings suggest that while blind people possess conceptual knowledge comparable to sighted individuals, visual experience influences how this knowledge is expressed through gesture. In particular, concepts that rely heavily on seeing things from a distance are harder to represent without vision.

“This is the first study to show that blind people’s gestures are shaped by their lack of visual experience,” lead researcher Ezgi Mamus noted. “It reveals how communication strategies adapt when one sense is absent, and highlights the complex interplay between language, perception, and expression.”

The social implications are far-reaching. Understanding how people with different perceptual experiences communicate can help improve education, accessibility tools, and assistive technologies. The research also has potential applications in developing AI systems that generate or interpret gestures in more human-like ways, especially across diverse user populations.


Follow-up

The team’s next step is to investigate how well blind people’s gestures are understood by sighted listeners. This will further clarify how gestures function as a bridge between thought and communication across sensory differences.

