News Release

FEFU scientists trained robots to make independent decisions in a changing environment

Scientists have developed software that allows AI robots with machine vision to adjust their movement trajectories in real time

Peer-Reviewed Publication

Far Eastern Federal University

A team of scientists from the School of Engineering of Far Eastern Federal University, the Institute of Marine Technology Problems, and the Institute of Automation and Control Processes of the Far Eastern Branch of the Russian Academy of Sciences has developed software that allows industrial AI robots with machine vision to plan and adjust the movement trajectories of their tools in real time without loss of the required precision.

The team's paper was recognized as the best in its session at the ICCAD'19 conference, held in Grenoble, France, on July 2-4, 2019.

The FEFU team developed and implemented a new principle of control for smart industrial robots - control through program signals. Under it, robots are able to set and adjust the trajectories and modes (speeds) of their tools' movement on their own while machining parts under uncertain conditions and in a changing working environment.
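
The release does not describe the software itself, but the following minimal Python sketch illustrates the general idea of such program-signal control under assumptions of our own: a hypothetical controller follows a planned tool path and rebuilds the remainder of the path whenever a simulated vision measurement shows that the workpiece has shifted beyond a tolerance. All names, numbers, and the straight-line planner are illustrative, not FEFU's implementation.

```python
# Illustrative sketch only: a minimal real-time correction loop in the spirit of the
# program-signal control described in the release. Everything here is a hypothetical
# stand-in, not the team's actual software.

import random

TOLERANCE_MM = 0.5  # assumed allowable deviation before the remaining path is rebuilt


def sense_workpiece_offset(step: int) -> float:
    """Stand-in for a machine-vision measurement of workpiece displacement (mm).

    For this demo the workpiece "shifts" by 2 mm from the fifth waypoint onward;
    otherwise only small measurement noise is seen.
    """
    shift = 2.0 if step >= 5 else 0.0
    return shift + random.uniform(-0.05, 0.05)


def plan_path(start, goal, steps=10):
    """Naive straight-line interpolation between two 3-D points (placeholder planner)."""
    return [
        tuple(s + (g - s) * t / steps for s, g in zip(start, goal))
        for t in range(steps + 1)
    ]


def run_pass(start, goal, steps=10):
    path = plan_path(start, goal, steps)[1:]  # waypoints still to visit
    position = start
    compensation = 0.0  # how much of the measured shift is already built into the plan
    for step in range(steps):
        offset = sense_workpiece_offset(step)
        if abs(offset - compensation) > TOLERANCE_MM:
            # The workpiece has moved relative to the current plan: shift the goal by
            # the newly measured offset and rebuild the remaining path from here.
            compensation = offset
            corrected_goal = (goal[0] + compensation, goal[1], goal[2])
            path = plan_path(position, corrected_goal, steps=len(path))[1:]
            print(f"replanned at step {step}: measured offset {offset:+.2f} mm")
        position = path.pop(0)
        # A real controller would command the manipulator to 'position' here.
        print(f"step {step}: move tool to ({position[0]:6.2f}, {position[1]:.2f}, {position[2]:.2f})")


if __name__ == "__main__":
    run_pass(start=(0.0, 0.0, 0.0), goal=(100.0, 0.0, 0.0))
```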

The new software allowed the team to achieve a precision of about 0.5 mm in the operation of robotic tools (including operations that require additional force to be applied). However, many high-accuracy operations require precision within the 0.1-0.2 mm range.

"The issue lies in the imprecise technology used to manufacture the robots themselves, and it hasn't been resolved anywhere in the world yet. We've already developed a method to eliminate this defect based on special test movements. It proved to be efficient in models, and right now we are working to implement it in practice. If we obtain positive results, it would be a breakthrough in the practical application of robots in general. And if no, we'd continue to work until we have a positive result. Generally, this is a working method," said Professor Vladimir Filaretov, a PhD in Technical Sciences, the Head of the Department for Automation and Management at the School of Engineering, FEFU, a Honored Science Worker of Russia, a Honoured Inventor of Russia, and a Honoured Engineer of Russia.

Using a machine vision system, the robot forms a virtual image of its workspace, recognizes each part, and determines its exact position. It can also identify deformations of large parts that occur when they are clamped in place. Based on this virtual image, the robot determines the trajectories of its working tools.
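
As a rough, hypothetical illustration of this pipeline (the release gives no implementation details), the sketch below builds a tool path directly on the measured geometry of a part, so that placement errors and clamping deformation are absorbed into the trajectory. Every class, function, and data value is an assumption made for the example.

```python
# Illustrative sketch only: a stand-in for the vision-to-trajectory pipeline the
# release outlines - build a virtual model of the workspace, locate each part,
# estimate its deformation, and derive the tool path from the measured geometry.

from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]


@dataclass
class PartObservation:
    """What the (hypothetical) vision system reports for one recognized part."""
    part_id: str
    pose_mm: Point               # measured position of the part's reference point
    measured_edge: List[Point]   # sampled points along the edge to be machined


def nominal_edge(part_id: str) -> List[Point]:
    """Nominal (CAD) edge for the part; hard-coded placeholder data."""
    return [(float(x), 0.0, 0.0) for x in range(0, 101, 25)]


def estimate_deformation(observed: List[Point], nominal: List[Point]) -> List[Point]:
    """Per-sample deviation of the observed edge from the nominal one (mm)."""
    return [tuple(o - n for o, n in zip(obs, nom)) for obs, nom in zip(observed, nominal)]


def tool_path(observation: PartObservation) -> List[Point]:
    """Build the tool path on the *measured* geometry, offset by the measured pose,
    so placement errors and clamping deformation are absorbed into the trajectory."""
    dx, dy, dz = observation.pose_mm
    return [(x + dx, y + dy, z + dz) for (x, y, z) in observation.measured_edge]


if __name__ == "__main__":
    obs = PartObservation(
        part_id="panel-7",
        pose_mm=(3.2, -1.5, 0.0),
        measured_edge=[(0.0, 0.1, 0.0), (25.0, 0.3, 0.0), (50.0, 0.6, 0.0),
                       (75.0, 0.4, 0.0), (100.0, 0.2, 0.0)],
    )
    deformation = estimate_deformation(obs.measured_edge, nominal_edge(obs.part_id))
    print("deformation per sample point:", deformation)
    print("tool path:", tool_path(obs))
```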

"It's important to emphasize that the methods, algorithms, and software developed by us are of universal nature. They can be used to control almost any types of robots: industrial robots, underwater devices, unmanned ground vehicles, flying, and many promising agricultural robots. They only require minor adjustments that are already included into the software and take into account their specific features. Our developments, including smart VR-based control, maximize on the capabilities of modern technologies and are able to increase the efficiency of technological processes by several times while preserving the quality of the products," added Professor Filaretov.

The new smart control method has already been implemented at the Dalpribor plant (Vladivostok) and is currently being tested and fine-tuned to meet current industrial requirements. The most recent version of the technology was presented at the IEEE International Conference on Control, Automation and Diagnosis 2019 (ICCAD'19) in Grenoble, where it received special recognition.

Based on the results of this work, a group of five scientists led by Professor Filaretov has applied for the Russian Federation Government Prize.

###

The work was carried out with the support of the Government of the Russian Federation (state contract No. 2.11216.2018/11.12), Presidential Grant No. MK-1987.2018.8, and Program No. 29 of the Russian Academy of Sciences.
