News Release

A tactile robot finger with no blind spots

Columbia Engineers first to demonstrate a robotic finger with a highly precise sense of touch over a complex, multicurved surface

Peer-Reviewed Publication

Columbia University School of Engineering and Applied Science

Image: Tactile fingers progressing through their manufacturing stages: 3D-printed skeleton, flexible circuit board, transparent silicone layer, and reflective skin.

Credit: Pedro Piacenza / Columbia Engineering


New York, NY--February 26, 2020--Researchers at Columbia Engineering announced today that they have introduced a new type of robotic finger with a sense of touch. Their finger can localize touch with very high precision--less than 1 mm--over a large, multicurved surface, much like its human counterpart.

"There has long been a gap between stand-alone tactile sensors and fully integrated tactile fingers--tactile sensing is still far from ubiquitous in robotic manipulation," says Matei Ciocarlie, associate professor in the departments of mechanical engineering and computer science, who led this work in collaboration with Electrical Engineering Professor Ioannis (John) Kymissis. "In this paper, we have demonstrated a multicurved robotic finger with accurate touch localization and normal force detection over complex 3D surfaces."

Current methods for building touch sensors have proven difficult to integrate into robot fingers: they struggle to cover multicurved surfaces, require high wire counts, or are too bulky to fit into small fingertips, which has prevented their use in dexterous hands. The Columbia Engineering team took a new approach: the novel use of overlapping signals from light emitters and receivers embedded in a transparent waveguide layer that covers the functional areas of the finger.

By measuring light transport between every emitter and receiver, they showed that they can obtain a very rich signal data set that changes in response to deformation of the finger due to touch. They then demonstrated that purely data-driven deep learning methods can extract useful information from the data, including contact location and applied normal force, without the need for analytical models. Their final result is a fully integrated, sensorized robot finger, with a low wire count, built using accessible manufacturing methods and designed for easy integration into dexterous hands.

The study, published online in IEEE/ASME Transactions on Mechatronics, demonstrates the two aspects of the underlying technology that combine to enable the new results. First, the researchers use light to sense touch. Under the "skin," their finger has a layer made of transparent silicone, into which they shine light from more than 30 LEDs. The finger also has more than 30 photodiodes that measure how the light bounces around. Whenever the finger touches something, its skin deforms, so light shifts around in the transparent layer underneath. By measuring how much light travels from every LED to every photodiode, the researchers end up with close to 1,000 signals that each contain some information about the contact that was made. Since light can also bounce around in a curved space, these signals can cover a complex 3D shape such as a fingertip.
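
For readers who want a concrete picture of where that signal count comes from, the minimal Python sketch below is illustrative only, not the researchers' code; the array sizes simply follow the 30-plus emitters and receivers described above, and the touch model is a placeholder perturbation:

```python
import numpy as np

# Hypothetical sketch of the sensing principle described above: with
# N_LEDS emitters and N_DIODES photodiodes, measuring the light
# transported between every emitter-receiver pair yields
# N_LEDS * N_DIODES overlapping signals (30 x 30 = 900, i.e. "close
# to 1,000"). All sizes and values are illustrative assumptions.
N_LEDS, N_DIODES = 30, 30

rng = np.random.default_rng(0)

# Baseline transmission: light measured at each photodiode for each
# LED when nothing is touching the finger.
baseline = rng.uniform(0.1, 1.0, size=(N_LEDS, N_DIODES))

# A touch deforms the waveguide and shifts light between pairs; here
# that is stood in for by a small random perturbation of the matrix.
touched = baseline + rng.normal(0.0, 0.02, size=baseline.shape)

# The per-pair change, flattened, is the feature vector handed to the
# learning algorithm: one value per LED-photodiode pair.
features = (touched - baseline).ravel()
print(features.shape)  # (900,) -- close to 1,000 signals
```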

"The human finger provides incredibly rich contact information--more than 400 tiny touch sensors in every square centimeter of skin!" says Ciocarlie. "That was the model that pushed us to try and get as much data as possible from our finger. It was critical to be sure all contacts on all sides of the finger were covered--we essentially built a tactile robot finger with no blind spots."

Second, the team designed this data to be processed by machine learning algorithms. Because there are so many signals, all of them partially overlapping with each other, the data is too complex to be interpreted by humans. Fortunately, current machine learning techniques can learn to extract the information that researchers care about: where the finger is being touched, what is touching the finger, how much force is being applied, etc.
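
As an illustration of this kind of data-driven processing, the short sketch below is again illustrative rather than the team's actual architecture; the layer sizes and the 900-signal input dimension are assumptions carried over from the example above:

```python
import torch
from torch import nn

# A minimal sketch (not the authors' network) of the kind of deep
# model described above: it maps the ~900 overlapping light signals
# to a 3D contact location on the finger surface plus the applied
# normal force. Layer widths are illustrative assumptions.
model = nn.Sequential(
    nn.Linear(900, 256),
    nn.ReLU(),
    nn.Linear(256, 64),
    nn.ReLU(),
    nn.Linear(64, 4),  # outputs: contact (x, y, z) and normal force
)

# Training would pair recorded signal vectors with ground-truth
# contact labels; a random batch stands in for real data here.
signals = torch.randn(32, 900)  # batch of 32 feature vectors
targets = torch.randn(32, 4)    # ground-truth (x, y, z, force)
loss = nn.functional.mse_loss(model(signals), targets)
loss.backward()                 # gradients for an optimizer step
print(loss.item())
```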

"Our results show that a deep neural network can extract this information with very high accuracy," says Kymissis. "Our device is truly a tactile finger designed from the very beginning to be used in conjunction with AI algorithms."

In addition, the team built the finger so that it, and others like it, can be put onto robotic hands. Integrating the system onto a hand is easy: thanks to this new technology, the finger collects almost 1,000 signals but needs only a 14-wire cable connecting it to the hand, and it requires no complex off-board electronics. The researchers already have two dexterous hands (capable of grasping and manipulating objects) in their lab being outfitted with these fingers--one hand has three fingers, and the other has four. In the coming months, the team will use these hands to demonstrate dexterous manipulation abilities based on tactile and proprioceptive data.

"Dexterous robotic manipulation is needed now in fields such as manufacturing and logistics, and is one of the technologies that, in the longer term, are needed to enable personal robotic assistance in other areas, such as healthcare or service domains," Ciocarlie adds.

###

About the Study

The study is titled "A Sensorized Multicurved Robot Finger with Data-Driven Touch Sensing via Overlapping Light Signals."

Authors are: Pedro Piacenza and Matei Ciocarlie, Mechanical Engineering; Keith Behrman and Ioannis Kymissis, Electrical Engineering; and Benedikt Schifferer, Computer Science.

The work was sponsored in part by the National Science Foundation, under its CAREER program (grant IIS-1551631) and the National Robotics Initiative (grant CMMI-1734557).

The authors declare no financial or other conflicts of interest.

LINKS:

Paper: https://ieeexplore.ieee.org/document/9006916
DOI: 10.1109/TMECH.2020.2975578

Journal: https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=3516

Video: https://youtu.be/PVw8Qy7BHU0

Columbia Engineering: http://engineering.columbia.edu/

Matei Ciocarlie: https://engineering.columbia.edu/faculty/matei-ciocarlie

Department of Mechanical Engineering: https://me.columbia.edu

Department of Computer Science: http://www.cs.columbia.edu/

Data Science Institute: https://datascience.columbia.edu/

Ioannis Kymissis: https://engineering.columbia.edu/faculty/ioannis-kymissis

Department of Electrical Engineering: http://www.ee.columbia.edu/

Columbia Engineering

Columbia Engineering, based in New York City, is one of the top engineering schools in the U.S. and one of the oldest in the nation. Also known as The Fu Foundation School of Engineering and Applied Science, the School expands knowledge and advances technology through the pioneering research of its more than 220 faculty, while educating undergraduate and graduate students in a collaborative environment to become leaders informed by a firm foundation in engineering. The School's faculty are at the center of the University's cross-disciplinary research, contributing to the Data Science Institute, Earth Institute, Zuckerman Mind Brain Behavior Institute, Precision Medicine Initiative, and the Columbia Nano Initiative. Guided by its strategic vision, "Columbia Engineering for Humanity," the School aims to translate ideas into innovations that foster a sustainable, healthy, secure, connected, and creative humanity.

