News Release

Biology inspires perceptive machines

Peer-Reviewed Publication

IST Results

The team brought together electronic engineers, computer scientists, neuroscientists, physicists and biologists, who studied basic neural models of perception and then sought to replicate aspects of them in silicon.

"The objective was to study sensory fusion in biological systems and then translate that knowledge into the creation of intelligent computational machines," says Martin McGinnity, Professor of Intelligent Systems Engineering and Director of the Intelligent Systems Engineering Laboratory (ISEL) at the University of Ulster's Magee Campus and coordinator of the Future and Emerging Technologies(FET) initiative-funded SENSEMAKER project of the IST programme.

SENSEMAKER took its inspiration from nature, trying to replicate aspects of the brain's neural processes, which capture sensory data from the eyes, ears and skin and then combine these inputs into a whole picture of the environment. For example, sight can identify a kiwi, but touch can help tell whether that kiwi is ripe, unripe or over-ripe.

What's more, if one sense is damaged, or if a sensory function is lost to environmental factors, say when vision fails in the dark, the brain switches more resources to the other senses, which become comparatively hypersensitive. In darkness the brain pours resources into hearing, touch and smell to extract the maximum possible data from the environment.
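One simple way to picture this reallocation is reliability-weighted cue combination, in which each sense is weighted by the inverse of its noise variance. The sketch below is purely illustrative and is not the project's model; the sense names and noise figures are invented assumptions.

```python
# Illustrative sketch (not the SENSEMAKER model): reliability-based
# reweighting of senses. Each sense gets a weight proportional to
# 1 / (noise variance), so when vision degrades in the dark its weight
# collapses and the remaining senses dominate. Noise figures are invented.

def sense_weights(noise_sd):
    """Return normalised inverse-variance weights, one per sense."""
    inv_var = {sense: 1.0 / sd ** 2 for sense, sd in noise_sd.items()}
    total = sum(inv_var.values())
    return {sense: w / total for sense, w in inv_var.items()}

# Assumed noise levels (standard deviations) under two conditions.
daylight = {"vision": 0.1, "touch": 0.5, "hearing": 0.4}
darkness = {"vision": 5.0, "touch": 0.5, "hearing": 0.4}

print(sense_weights(daylight))  # vision carries about 90% of the weight
print(sense_weights(darkness))  # weight shifts almost entirely to touch and hearing
```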

The team concentrated on two senses: sight and touch. The experimental touch-sensor system, developed in Heidelberg and used by the SENSEMAKER partner Trinity College, Dublin, is itself quite novel: it features an array of small, movable, spring-loaded pins. This made it possible to conduct psychophysical experiments on touch and vision in human subjects, and it proved a valuable tool for exploring human responses to sensory integration. The results of these experiments helped to inform the sensory fusion model.

Modelling sensory fusion

The project has created a sophisticated, biologically inspired model of sensory fusion for the tactile and visual senses. Perhaps its greatest achievement is a framework that allows extensive experimentation in sensory integration. The work can easily be extended to other sensory modalities; the project partners are currently planning to extend it to the auditory sense. The hardware implementations of the model, which learn extremely rapidly compared with biological timescales, will be exploited in follow-up projects.

"Using these systems we were able to show that the merging of tactile and visual information, or sensory fusion, improved overall performance," says Professor McGinnity. The ultimate outcome of this type of research is to implement perception capabilities in computer systems, with applications in a wide range of areas including robotics.

But a greater understanding of biological sensory fusion, and of how to implement it in artificial systems, could potentially do much more.

"This type of research teaches us a lot about how biological systems work, and it could lead to new ways of treating people with sensory-related disabilities, though that kind of outcome will take a long time," says Professor McGinnity.

He says intelligent systems need to adapt to their environment without reprogramming and to react autonomously in a manner that humans would describe as intelligent; for that, they need a perception system that makes them aware of their surroundings.

Two other initiatives will carry aspects of this work further. The FACETS project, also funded by FET, will continue to explore machine perception, focusing on vision. Meanwhile, ISEL at the Magee Campus is actively engaged in a major proposal to create a Centre of Excellence in Intelligent Systems. The Centre would pursue a range of research problems related to the creation of intelligent systems, including sensory fusion, learning, adaptation, self-organisation, the implementation of large-scale biological neural sub-systems in hardware, and distributed computational intelligence.

The project has given the team's biologists and neuroscientists a greater understanding of the engineering approach to problem solving and system design; conversely, the engineers on the team gained a much deeper insight into biological system modelling. Overall, the project has contributed to an improved understanding of how biological systems merge multimodal sensory information, one of the most difficult problems in science today. The results of the SENSEMAKER project are being disseminated in high-quality international journals, reflecting the fact that the research is at the state of the art. Both biological and neurological science on the one hand, and machine intelligence and computer science on the other, have benefited from its successful conclusion.

###

Contact:
Professor Martin McGinnity
School of Computing and Intelligent Systems
Faculty of Engineering
University of Ulster
Northern Ireland
Tel: +44-28-71375417
Email: tm.mcginnity@ulster.ac.uk

Source: Based on information from SENSEMAKER

PLEASE MENTION IST RESULTS AS THE SOURCE OF THIS STORY AND, IF PUBLISHING ONLINE, PLEASE HYPERLINK TO: http://istresults.cordis.lu/
