News Release

Researchers develop low-cost sensor to enhance robots' sense of touch

Researchers from Queen Mary University of London, along with collaborators from China and the USA, have developed the L3 F-TOUCH sensor to enhance tactile capabilities in robots, allowing them to "feel" objects and adjust their grip accordingly

Peer-Reviewed Publication

Queen Mary University of London

Image: The sensor feeling the objects. Credit: F-TOUCH Team

Achieving human-level dexterity during manipulation and grasping has been a long-standing goal in robotics. To accomplish this, robots need reliable tactile and force information. A recent study, published in IEEE Robotics and Automation Letters, describes the L3 F-TOUCH sensor, which enhances the force-sensing capabilities of classic tactile sensors. The sensor is lightweight, low-cost, and wireless, making it an affordable option for retrofitting existing robot hands and graspers.

The human hand can sense pressure, temperature, texture, and pain. It can also distinguish between objects by their shape, size, weight, and other physical properties. Many current robot hands and graspers come nowhere near this, as they lack integrated haptic capabilities, which complicates object handling. Without knowledge of the interaction forces and the shape of the handled object, the robot's fingers have no "sense of touch": objects can easily slip out of their grasp, or be crushed if they are fragile.

The study, led by Professor Kaspar Althoefer of Queen Mary University of London, presents the new L3 F-TOUCH, a high-resolution fingertip sensor whose name stands for Lightweight, Low-cost, and wireLess communication. The sensor can measure an object's geometry and determine the forces involved in interacting with it. Unlike other sensors, which estimate interaction forces from tactile information acquired via camera images, the L3 F-TOUCH measures interaction forces directly, achieving higher measurement accuracy.

"In contrast to its competitors that estimate experienced interaction forces through reconstruction from camera images of the deformation of their soft elastomer, the L-3 F-TOUCH measures interaction forces directly through an integrated mechanical suspension structure with a mirror system achieving higher measurement accuracy and wider measurement range. The sensor is physically designed to decouple force measurements from geometry information. Therefore, the sensed three-axis force is immuned from contact geometry compared to its competitors. Through embedded wireless communications, the sensor also outperforms competitors with regards to integrability with robot hands." says Professor Kaspar Althoefer. 

When the sensor touches a surface, a compact suspension structure allows the elastomer – a rubber-like material that deforms under external force to capture high-resolution contact geometry – to displace upon contact. To make sense of this, the elastomer's displacement is tracked by detecting the movement of a special marker, a so-called ARTag, and the contact forces along the three major axes (x, y, and z) are then derived via a calibration process.
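To illustrate what such a calibration step might look like, here is a minimal sketch in Python. It assumes a simple linear mapping from the tracked ARTag displacement to a three-axis force, fitted by least squares against reference force readings; the function names, data shapes, and the linear model itself are illustrative assumptions, not the team's published implementation.

```python
# Minimal sketch (illustrative, not the L3 F-TOUCH team's code):
# map a tracked ARTag marker displacement to a three-axis contact force
# via a linear calibration fitted from paired reference measurements.
import numpy as np

def fit_calibration(displacements, forces):
    """Fit a 3x3 matrix K and offset b so that force ≈ K @ displacement + b,
    using samples recorded alongside a reference force/torque sensor."""
    X = np.asarray(displacements)                   # shape (N, 3): marker shifts in x, y, z
    F = np.asarray(forces)                          # shape (N, 3): reference forces in newtons
    X_aug = np.hstack([X, np.ones((len(X), 1))])    # append a column of 1s for the offset
    W, *_ = np.linalg.lstsq(X_aug, F, rcond=None)   # least-squares solution, shape (4, 3)
    return W[:3].T, W[3]                            # K (3x3), b (3,)

def displacement_to_force(displacement, K, b):
    """Convert one tracked marker displacement into a force estimate."""
    return K @ np.asarray(displacement) + b

if __name__ == "__main__":
    # Toy calibration data: a stiffness-like mapping plus measurement noise.
    rng = np.random.default_rng(0)
    true_K = np.diag([120.0, 115.0, 300.0])
    disp = rng.uniform(-2e-3, 2e-3, size=(200, 3))          # simulated displacements (metres)
    force = disp @ true_K.T + rng.normal(0, 0.01, (200, 3))
    K, b = fit_calibration(disp, force)
    print(displacement_to_force([1e-3, 0.0, -5e-4], K, b))  # estimated force for a new contact
```

In practice, the mapping between marker motion and force need not be linear; the sketch simply shows how a calibration turns tracked displacements into force readings along the three axes.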

"We will focus our future work on extending the sensor's capabilities to measure not only forces along the three major axes but also rotational forces, such as the twist experienced during screw fastening, while keeping the sensor accurate and compact. These advancements could bring a sense of touch to more dynamic and agile robots for manipulation tasks, even in human-robot interaction settings such as patient rehabilitation or physical support for the elderly," adds Professor Althoefer.

This breakthrough could pave the way for more advanced and reliable robotics: equipped with the L3 F-TOUCH sensor, robots gain a sense of touch, making them more capable of handling objects and performing complex manipulation tasks.
