To explore a new environment, a robot must observe nearby objects and landmarks while also keeping track of its own location to make sense of the features it observes. That problem is hard enough if you’re an airborne drone or a ground-based autonomous vehicle. Move underwater, though, and things get really difficult: GPS doesn’t work, so robots must deduce their location from patchy, uncertain information gleaned from an equally uncertain environment.
Now, researchers at Stevens Institute of Technology have developed an underwater robot capable of simultaneously mapping its environment, tracking its own location, and planning safe routes through complex and dangerous marine environments in real time, using its own predictions about uncertainty to make decisions.
The breakthrough is significant: their algorithm enabled the autonomous robot explorer to become the first of its kind deployed in a real-world marine environment to monitor and manage its level of uncertainty about its own location, based on the areas it has explored and the landmarks it has encountered.
“This is a big step forward,” said Brendan Englot, project lead and interim director of the Stevens Institute for Artificial Intelligence, whose work will appear in the next issue of the IEEE Journal of Oceanic Engineering. “Underwater mapping in an obstacle-filled environment is a very hard problem, because you don’t have the same situational awareness as with a flying or ground-based robot — and that makes sending a robot underwater an inherently risky process.”
The team, led by Englot, tested their device — and the “active SLAM” (simultaneous localization and mapping) algorithms that power its robot brain — in a crowded marina.
While many researchers treat SLAM problems as a conceptual challenge, using datasets to find optimal mapping strategies after the fact, Englot and his team were able to bring their algorithms into the real world to guide their robot’s mapping strategies in real time. Operating at a depth of 1 meter, the team’s highly customized BlueROV2 robot was able to explore and accurately map a busy harbor at the U.S. Merchant Marine Academy in Kings Point, NY.
Englot’s team’s robot uses sonar signals to detect objects and chart its environment. It can also forecast how its level of uncertainty will change if it moves into different neighboring regions, and weigh how useful it would be to identify new landmarks in a given area of the map. That allows the robot to intelligently plot routes that minimize uncertainty while still capturing useful information about unexplored regions.
“Essentially, the robot knows what it doesn’t know, which lets it make smarter decisions,” Englot said. “By creating a virtual map that accounts for the robot’s own confidence about where it is and what it’s seeing, the robot can quickly, safely, and accurately map a new environment.”
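The decision process described above — forecasting how uncertainty would change in each candidate region and weighing that against the value of exploring new territory — can be illustrated with a minimal sketch. This is not the team’s actual algorithm; the candidate regions, scores, and weighting here are hypothetical placeholders for whatever uncertainty prediction and information-gain estimates a real active-SLAM planner would compute.

```python
# Hypothetical sketch of uncertainty-aware waypoint selection.
# Each candidate region carries a predicted localization uncertainty
# (e.g., how lost the robot expects to be after moving there) and an
# expected information gain (how much new area or landmark data it offers).

def choose_waypoint(candidates, weight=0.5):
    """Pick the candidate that best trades predicted uncertainty
    against expected information gain (lower score is better)."""
    def score(c):
        return weight * c["predicted_uncertainty"] - (1 - weight) * c["info_gain"]
    return min(candidates, key=score)

candidates = [
    {"name": "open water",             "predicted_uncertainty": 4.0, "info_gain": 3.0},
    {"name": "pier with known pilings", "predicted_uncertainty": 0.5, "info_gain": 1.0},
    {"name": "unexplored slip",         "predicted_uncertainty": 1.8, "info_gain": 2.6},
]

best = choose_waypoint(candidates)
print(best["name"])  # the planner balances safety against new information
```

With an even weighting, the sketch prefers the unexplored slip: it promises new information without the large localization risk of open water. Raising `weight` toward 1 makes the planner conservative, steering it back toward well-landmarked areas — the kind of trade-off the quote above describes as the robot “knowing what it doesn’t know.”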
The team’s “active SLAM” technology could have countless real-world applications, from streamlining harbor repairs to building and maintaining offshore wind farms, aquaculture projects, and drilling rigs. Similar algorithms could also help autonomous vehicles operate in any area where GPS signals can’t reach, from indoor areas such as parking structures, to underground environments, or even outdoor areas where tree cover impedes GPS functionality.
For now, Englot and his team at Stevens are working to ruggedize their robot platform to enable longer-lasting undersea missions, and also upgrading its sonar capabilities and algorithms to enable mapping of more complex three-dimensional environments.
“There’s a burgeoning need for this technology,” Englot said. “The hardware still needs to evolve a bit, but what we’re doing is highly relevant to a wide range of commercial interests.”
Journal: IEEE Journal of Oceanic Engineering
Article: Virtual Maps for Autonomous Exploration of Cluttered Underwater Environments