News Release

How can robots learn from humans?

A sensory-control synergy lets robots learn universal grasping skills from humans

Peer-Reviewed Publication

Science China Press

Human-Robot Collaboration

Collaboration between humans and robots, building an intelligent future.

Credit: Rong Zhu’s Research Team, Tsinghua University. The photo was taken by the researchers.

When a human picks up a fragile raw egg or a slippery, heavy metal cup, the hand naturally adjusts its grip based on tactile experience to prevent slipping or crushing. For robots, such adaptive grasping remains a major challenge.

Now, a team of researchers from Tsinghua University has developed a new approach for teaching robots human grasping skills. In a study published in National Science Review, they report a sensory-control synergy approach that builds a new framework for characterizing human grasping experience and allows robots to learn grasping skills from humans.

“Humans use tactile sensation to recognize grasping states in real time and instantly fine-tune their grip accordingly,” says Prof. Rong Zhu, corresponding author of the study. “We want to give robots a similar ability: to sense, interpret, and act on tactile feedback in real time.”

Bio-inspired sensory-control synergy framework

To capture the process of human grasping, the researchers first developed a tactile glove equipped with custom-built tactile sensors on the fingertips. Worn on a human hand, the glove captures multimodal tactile data on contact, slip, and pressure during grasping actions.

The key innovation in this study lies in the new strategy for interpreting multimodal tactile data. Inspired by human neurocognition, the tactile data measured in real time are encoded into high-level semantic grasping states, such as “stable,” “slightly unstable,” or “highly unstable.” This abstraction removes uninformative variation in the tactile data caused by differences in grasp position or posture across individual grasps, which greatly enhances the method's generality.
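As a rough illustration of this state-abstraction idea, the sketch below maps raw tactile readings to discrete grasping-state labels. The feature names and thresholds here are assumptions made purely for illustration; they are not the encoding reported in the paper.

```python
# Illustrative sketch only: maps hypothetical tactile features to semantic
# grasping states. Feature names and thresholds are assumptions, not the
# encoding scheme published by the Tsinghua team.
from dataclasses import dataclass


@dataclass
class TactileFrame:
    contact: bool       # is the fingertip in contact with the object?
    slip_rate: float    # assumed slip signal, arbitrary units
    pressure: float     # assumed normal-pressure signal, arbitrary units


def encode_grasp_state(frame: TactileFrame) -> str:
    """Abstract raw tactile readings into a high-level grasping state."""
    if not frame.contact:
        return "no contact"
    if frame.slip_rate > 0.5 or frame.pressure < 0.1:
        return "highly unstable"
    if frame.slip_rate > 0.1:
        return "slightly unstable"
    return "stable"


# Example: a frame with mild slip is abstracted to "slightly unstable",
# regardless of exactly where on the object the fingers happened to land.
print(encode_grasp_state(TactileFrame(contact=True, slip_rate=0.2, pressure=0.4)))
```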

“Instead of training the robot with thousands of precise tactile measurements gathered while grasping various objects, we teach the robot to recognize general states of interaction,” Prof. Rong Zhu says. “This strategy makes the approach data-efficient and highly transferable.”

Furthermore, a fuzzy logic controller that mimics human decision-making is developed to translate grasping states into actions. For example, if the state is “highly unstable,” the robot tightens its grip sharply; if it is “stable,” it simply holds the current grip.
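A minimal sketch of this kind of fuzzy-style controller is shown below. It is a simplified Sugeno-like approximation assumed for illustration; the state names carry over from the example above, while the per-state actions and weighting scheme are not the published design.

```python
# Minimal sketch of a fuzzy-style grip controller (assumed, simplified design):
# each grasping state carries a membership degree in [0, 1], and the grip
# adjustment is a membership-weighted blend of per-state actions.

# Per-state grip increments (illustrative values, not from the paper).
STATE_ACTIONS = {
    "stable": 0.0,             # hold the current grip
    "slightly unstable": 0.2,  # tighten gently
    "highly unstable": 1.0,    # tighten strongly
}


def grip_adjustment(memberships: dict[str, float]) -> float:
    """Sugeno-style defuzzification: weighted average of per-state actions."""
    total = sum(memberships.values())
    if total == 0.0:
        return 0.0
    return sum(STATE_ACTIONS[s] * w for s, w in memberships.items()) / total


# Example: mostly "slightly unstable" with a hint of "highly unstable"
# yields a moderate tightening command.
print(grip_adjustment({"stable": 0.1, "slightly unstable": 0.7, "highly unstable": 0.2}))
```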

Human-taught robotic grasping

The human sensory-control synergy framework can be readily transferred to a robotic hand equipped with tactile sensors. The robot can then adaptively grasp diverse objects, including a slippery umbrella, a fragile raw egg, and a heavy bottle, and it generalizes well to new, unseen objects.

Notably, the human sensory-control synergy framework can be built efficiently from a small dataset collected by a single person. Experimental validation demonstrates that a robot with the transferred framework achieves an average success rate of 95.2% when grasping slippery, fragile, soft, and heavy objects. In dynamic tests, the robot resists external pulls and prevents slips by sensing disturbances and tightening its grip autonomously.

The team also demonstrated the robot accomplishing a real-world task: hand-brewing coffee. From locating items and scooping coffee powder to stirring and serving, the robot used its tactile-based control to handle uncertainty at each step.

“Robots learn universal grasping by understanding the sensory and control logic behind the sensing data, rather than by imitating human motion trajectories,” says Prof. Zhu. “We teach robots the ability to draw inferences from a single instance.”

This human-like sensory-control synergy approach enables robots to learn effectively from human experience, achieving adaptive and universal grasping of diverse objects and offering a promising avenue for intelligent robots to accomplish tasks in real-world scenarios.

C. Wei et al., Human-Taught Sensory-Control Synergy for Universal Robotic Grasping, National Science Review, 2025, nwaf583. https://doi.org/10.1093/nsr/nwaf583

The research group headed by Prof. Rong Zhu is based at the State Key Laboratory of Precision Measurement Technology and Instrument, Department of Precision Instrument, Tsinghua University.
