News Release

Sounds modify visual perception: A study reveals new links between hearing and vision in the rodent brain

A new study reveals the existence, in rats, of direct cross-modal interactions between distinct sensory modalities, capable of modulating perceptual processing. Specifically, auditory signals were found to compress the visual perceptual space, exerting an inhibitory influence on visual perception.

Peer-Reviewed Publication

Scuola Internazionale Superiore di Studi Avanzati

Sounds can alter the way the brain interprets what it sees. This is the key finding of a new study by SISSA researchers in Trieste, published in PLOS Computational Biology. The research shows that, when sounds are paired with moving visual stimuli, the latter are perceived differently by rats. In particular, auditory cues systematically alter vision by compressing the animals’ “perceptual space”. Derived from the integration of behavioural experiments and computational modelling, the researchers’ findings indicate that auditory signals exert an inhibitory influence on visual perception. The study thus provides a new perspective on how the senses communicate within the brain, revealing that even direct connections between primary sensory areas — not only integration within higher-order association cortices — can profoundly influence perceptual experience.

How the Senses Communicate in the Brain

In the brain, information from the different senses is first processed within specialised regions devoted to each modality (known as unimodal areas). From these regions, sensory inputs then converge onto specialised areas, known as association cortices, where they are integrated to construct a coherent, multimodal representation of the environment. Such processing is highly sophisticated and occurs within cortical regions endowed with complex cognitive capabilities. “However, inputs from one sensory modality — such as sound — may directly affect the processing of another, such as vision, through connections linking unimodal cortical areas,” explains Davide Zoccolan, who coordinated the experimental component of the study. “This phenomenon appears to be less prominent in non-human primates and in humans, yet is believed to be more pronounced in rodents. Nevertheless, previous investigations have yielded inconsistent findings: while some reported an enhancement of visual neuronal responses in the presence of sounds, others identified a suppressive effect. With our work, we sought to clarify precisely this issue at the perceptual level.”

Experiments to Understand How Sounds Influence Vision

The researchers at SISSA combined behavioural experiments with computational modelling. A group of rats was trained to classify visual stimuli according to their temporal frequency, while being simultaneously exposed to sounds that were entirely irrelevant to the task. The temporal frequency of these sounds, however, could either match that of the visual stimuli or not. Zoccolan explains: “One might expect that the visual cortex spontaneously integrates sounds (even though they are task-irrelevant), and that auditory stimuli congruent with the visual ones would enhance visual perception, while incongruent sounds would impair it. But this is not the case.”

When Sound Inhibits Vision

The approach adopted by the Trieste research team made it possible to rule out high-level multisensory integration, thus assessing the direct influence of auditory signals on visual perception. As Eugenio Piasini, who coordinated the computational component of the study, explains, “the results show that rat visual classification performance was systematically altered by the presence of sounds, in a manner proportional to their intensity but independent of their temporal modulation. Specifically, pairing sounds with moving visual stimuli led to a compression of the visual perceptual space. The overall effect of auditory stimuli on visual ones was therefore inhibitory.” In summary, in the presence of intense sounds, the temporal frequency of visual stimuli was substantially underestimated by the rats.

Experimental Data Confirmed by Computational Modelling

To understand the mechanisms underlying this phenomenon, the researchers developed a Bayesian model combined with a neural coding scheme in which visual neurons were inhibited by concurrent sounds in proportion to their intensity. As Eugenio Piasini explains, “the model reproduced the experimental data with remarkable accuracy, supporting the hypothesis that auditory inputs can selectively inhibit the activity of visual neurons and thereby modify perception.”
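To make the logic of such a model concrete, the sketch below implements a toy Bayesian observer in which a concurrent sound attenuates the visual measurement in proportion to its intensity before the measurement is combined with a prior over temporal frequencies. This is a minimal illustration, not the authors' published model: the function perceived_frequency and all parameter values are assumptions chosen only to show how intensity-dependent inhibition produces the underestimation described above.

    # Toy Bayesian observer: sound-driven inhibition compresses perceived
    # temporal frequency. Illustrative sketch only; parameters are assumptions.
    import numpy as np

    def perceived_frequency(true_freq_hz, sound_intensity,
                            prior_mean=4.0, prior_sd=3.0,
                            noise_sd=1.0, inhibition_gain=0.15):
        """Posterior-mean estimate of a visual stimulus' temporal frequency (Hz).

        Concurrent sound attenuates the visual measurement in proportion to
        its intensity (a stand-in for inhibition of visual neurons); the
        attenuated measurement is then combined with a Gaussian prior,
        pulling the estimate downward and compressing the perceptual space.
        """
        # Inhibition: louder sounds shrink the effective visual signal.
        gain = 1.0 / (1.0 + inhibition_gain * sound_intensity)
        measurement = gain * true_freq_hz

        # Gaussian prior x Gaussian likelihood -> posterior mean.
        w_prior = 1.0 / prior_sd ** 2
        w_meas = 1.0 / noise_sd ** 2
        return (w_prior * prior_mean + w_meas * measurement) / (w_prior + w_meas)

    if __name__ == "__main__":
        for sound in (0.0, 5.0, 10.0):   # arbitrary intensity units
            estimates = [perceived_frequency(f, sound) for f in (2.0, 6.0, 10.0)]
            print(f"sound intensity {sound:>4}: perceived ~ {np.round(estimates, 2)}")

In this toy version, raising the sound intensity lowers every estimate of temporal frequency, qualitatively reproducing the compression of the perceptual space reported in the study.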

A New Perspective on the Multisensory Brain

“Our study,” concludes Zoccolan, “provides a new perspective on how the senses communicate within the brain, showing that even direct connections between primary sensory areas — not only multimodal integration within higher-order association cortices — can profoundly influence perceptual experience. Now that the inhibitory effect of sounds on vision has been established, it will certainly be interesting to determine whether the reverse also holds true. As for the evolutionary and ecological processes underlying this mechanism, we can only speculate. This inhibitory interaction between auditory and visual processing may reflect inter-areal competition mechanisms through which the brain enhances the salience of one modality while suppressing another, for instance by prioritising auditory perception, which can more rapidly alert the organism to the presence of potential predators. Undoubtedly, these animals live in a multisensory world that is far more complex and integrated than we might have imagined.”

