Bioinspired triboelectric-driven multisensory framework for cross-modal associative learning.
Caption
Bioinspired triboelectric-driven multisensory framework for cross-modal associative learning. Schematic illustration of a bioinspired multisensory framework enabled by triboelectric sensing and artificial neural networks (TES-ANN). Signals from vision, touch, hearing, smell, and taste are first captured by distributed triboelectric sensors and encoded into neural-like electrical spikes. These multimodal signals are then transmitted and modulated before being integrated within an artificial neural network that performs cross-modal associative learning. Through this process, information from one sensory modality can be reconfigured into another, enabling recognition, inference, and imagination in a self-powered, energy-efficient manner that mimics key features of human multisensory cognition.
Credit
Yao Xiong, Yang Liu, et al.
Usage Restrictions
Credit must be given to the creator.
License
CC BY