News Release

Fleets of drones could aid searches for lost hikers

System allows drones to cooperatively explore terrain under thick forest canopies where GPS signals are unreliable

Peer-Reviewed Publication

Massachusetts Institute of Technology

Exploration Drones

Image: MIT researchers describe an autonomous system for a fleet of drones to collaboratively search under dense forest canopies using only onboard computation and wireless communication -- no GPS required.

Credit: Melanie Gonick

Finding lost hikers in forests can be a difficult and lengthy process, as helicopters and drones can't get a glimpse through the thick tree canopy. Recently, it's been proposed that autonomous drones, which can bob and weave through trees, could aid these searches. But the GPS signals used to guide the aircraft can be unreliable or nonexistent in forest environments.

In a paper being presented at the International Symposium on Experimental Robotics next week, MIT researchers describe an autonomous system for a fleet of drones to collaboratively search under dense forest canopies. The drones use only onboard computation and wireless communication -- no GPS required.

Each autonomous quadrotor drone is equipped with laser-range finders for position estimation, localization, and path planning. As the drone flies around, it creates an individual 3-D map of the terrain. Algorithms help it recognize unexplored and already-searched spots, so it knows when it's fully mapped an area. An off-board ground station fuses individual maps from multiple drones into a global 3-D map that can be monitored by human rescuers.
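
The release doesn't include the team's code, but the idea of a drone tracking which parts of its map are still unknown -- and deciding when an area is fully mapped -- can be sketched roughly as follows. This is a minimal illustration under simplifying assumptions (a flat 2-D occupancy grid, a hypothetical coverage threshold), not the authors' implementation.

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

class OccupancyGrid:
    """Toy 2-D occupancy grid: cells start unknown and are filled in
    as the drone's range sensors observe them."""

    def __init__(self, width, height):
        self.cells = np.full((height, width), UNKNOWN, dtype=np.int8)

    def mark(self, x, y, value):
        self.cells[y, x] = value

    def coverage(self):
        """Fraction of cells that have been observed (free or occupied)."""
        return float(np.mean(self.cells != UNKNOWN))

    def fully_mapped(self, threshold=0.95):
        """Treat the area as searched once coverage passes a threshold."""
        return self.coverage() >= threshold

# Example: after marking a few observed cells, the grid reports how much is unexplored.
grid = OccupancyGrid(width=100, height=100)
grid.mark(10, 20, FREE)
grid.mark(11, 20, OCCUPIED)
print(f"coverage: {grid.coverage():.2%}, done: {grid.fully_mapped()}")
```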

In a real-world implementation, though not in the current system, the drones would come equipped with object detection to identify a missing hiker. Once the hiker is located, the drone would tag that spot on the global map. Humans could then use this information to plan a rescue mission.

"Essentially, we're replacing humans with a fleet of drones to make the search part of the search-and-rescue process more efficient," says first author Yulun Tian, a graduate student in the Department of Aeronautics and Astronautics (AeroAstro).

The researchers tested multiple drones in simulations of randomly generated forests, and tested two drones in a forested area within NASA's Langley Research Center. In both experiments, each drone mapped a roughly 20-square-meter area in about two to five minutes, and the drones collaboratively fused their maps in real time. The drones also performed well across several metrics, including overall speed and time to complete the mission, detection of forest features, and accurate merging of maps.

Exploring and mapping

On each drone, the researchers mounted a LIDAR system, which creates a 2-D scan of the surrounding obstacles by shooting laser beams and measuring the reflected pulses. This can be used to detect trees; however, to drones, individual trees appear remarkably similar. If a drone can't recognize a given tree, it can't determine if it's already explored an area.

The researchers programmed their drones to instead identify multiple trees' orientations, which is far more distinctive. With this method, when the LIDAR signal returns a cluster of trees, an algorithm calculates the angles and distances between trees to identify that cluster. "Drones can use that as a unique signature to tell if they've visited this area before or if it's a new area," Tian says.
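
As a rough sketch of that idea -- not the authors' actual algorithm -- one could summarize a detected cluster of trees by the pairwise distances and angles between them, which stay the same no matter where the drone is standing when it sees the cluster. The function below is hypothetical and assumes tree centers have already been extracted from the LIDAR scan.

```python
import itertools
import math

def cluster_signature(tree_positions, precision=1):
    """Build a rotation- and translation-invariant signature for a cluster of trees.

    tree_positions: list of (x, y) tree centers extracted from a LIDAR scan.
    Returns sorted tuples of rounded pairwise distances and triangle angles,
    which don't change with the drone's viewpoint.
    """
    distances = []
    for (x1, y1), (x2, y2) in itertools.combinations(tree_positions, 2):
        distances.append(round(math.hypot(x2 - x1, y2 - y1), precision))

    angles = []
    for a, b, c in itertools.combinations(tree_positions, 3):
        # Angle at vertex b of the triangle formed by trees a, b, c.
        v1 = (a[0] - b[0], a[1] - b[1])
        v2 = (c[0] - b[0], c[1] - b[1])
        norm = math.hypot(*v1) * math.hypot(*v2)
        if norm == 0:
            continue
        cos_angle = (v1[0] * v2[0] + v1[1] * v2[1]) / norm
        angles.append(round(math.degrees(math.acos(max(-1.0, min(1.0, cos_angle)))), precision))

    return tuple(sorted(distances)), tuple(sorted(angles))

# A drone can keep a set of signatures it has already seen:
visited = set()
sig = cluster_signature([(0.0, 0.0), (2.0, 0.5), (1.0, 3.0)])
seen_before = sig in visited   # True would mean "been here before"
visited.add(sig)
```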

This feature-detection technique helps the ground station accurately merge maps. The drones generally explore an area in loops, producing scans as they go. The ground station continuously monitors the scans. When two drones loop around to the same cluster of trees, the ground station merges the maps by calculating the relative transformation between the drones, and then fusing the individual maps to maintain consistent orientations.

"Calculating that relative transformation tells you how you should align the two maps so it corresponds to exactly how the forest looks," Tian says.

In the ground station, robotic navigation software called "simultaneous localization and mapping" (SLAM) -- which both maps an unknown area and keeps track of an agent inside the area -- uses the LIDAR input to localize and track the position of each drone. This helps it fuse the maps accurately.

The end result is a map with 3-D terrain features. Trees appear as blocks shaded from blue to green, depending on their height. Unexplored areas are dark but turn gray as they're mapped by a drone. On-board path-planning software steers each drone toward these dark, unexplored areas as it flies around. Producing a 3-D map is more reliable than simply attaching a camera to a drone and monitoring the video feed, Tian says. Transmitting video to a central station, for instance, requires a lot of bandwidth that may not be available in forested areas.
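
The release doesn't say how the planner formalizes "dark, unexplored areas," but a common way to do so is with frontier cells: unknown cells that border already-mapped free space. A minimal sketch under that assumption, reusing the toy occupancy-grid values from the earlier example:

```python
import numpy as np

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def frontier_cells(grid):
    """Return (row, col) indices of unknown cells adjacent to known free space.

    grid: 2-D numpy array of UNKNOWN / FREE / OCCUPIED values.
    These 'frontier' cells are the dark areas a planner would steer a drone toward.
    """
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != UNKNOWN:
                continue
            neighbors = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if np.any(neighbors == FREE):
                frontiers.append((r, c))
    return frontiers
```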

More efficient searching

A key innovation is a novel search strategy that lets the drones explore an area more efficiently. In a more traditional approach, a drone would always search the closest possible unknown area. But that area could lie in any number of directions from the drone's current position, so the drone typically flies a short distance, then stops to select a new direction.

"That doesn't respect dynamics of drone [movement]," Tian says. "It has to stop and turn, so that means it's very inefficient in terms of time and energy, and you can't really pick up speed."

Instead, the researchers' drones explore the closest possible area while considering their speed and direction and maintaining a consistent velocity. This strategy -- where the drone tends to travel in a spiral pattern -- covers a search area much faster. "In search and rescue missions, time is very important," Tian says.
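
One way to read that strategy -- as a sketch, not the paper's actual planner -- is that instead of scoring candidate frontiers by distance alone, the drone scores them by how little it would have to slow down or turn to reach them, given its current heading. The scoring function and weight below are illustrative assumptions.

```python
import math

def pick_goal(position, heading, frontiers, turn_weight=2.0):
    """Choose the next frontier to visit, penalizing sharp turns.

    position: (x, y) of the drone; heading: current travel direction in radians.
    frontiers: list of (x, y) candidate unexplored points.
    A plain nearest-frontier baseline uses distance only, forcing frequent
    stop-and-turn maneuvers; adding a heading-change penalty keeps the drone
    moving at a consistent velocity along smooth, sweeping paths.
    """
    def cost(f):
        dx, dy = f[0] - position[0], f[1] - position[1]
        distance = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx)
        # Smallest angle the drone would have to turn from its current heading.
        turn = abs(math.atan2(math.sin(bearing - heading), math.cos(bearing - heading)))
        return distance + turn_weight * turn

    return min(frontiers, key=cost)

# Baseline for comparison: pure nearest-frontier selection.
def pick_goal_nearest(position, frontiers):
    return min(frontiers, key=lambda f: math.hypot(f[0] - position[0], f[1] - position[1]))
```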

In the paper, the researchers compared their new search strategy with a traditional method. Against that baseline, their strategy helped the drones cover significantly more area, several minutes faster and at higher average speeds.

One limitation for practical use is that the drones still must communicate with an off-board ground station for map merging. In their outdoor experiment, the researchers had to set up a wireless router that connected each drone and the ground station. In the future, they hope to design the drones to communicate wirelessly when approaching one another, fuse their maps, and then cut communication when they separate. The ground station, in that case, would only be used to monitor the updated global map.

###

Co-authors on the paper are: Katherine Liu, a PhD student in MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and AeroAstro; Kyel Ok, a PhD student in CSAIL and the Department of Electrical Engineering and Computer Science; Loc Tran and Danette Allen of the NASA Langley Research Center; Nicholas Roy, an AeroAstro professor and CSAIL researcher; and Jonathan P. How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics.

Written by Rob Matheson, MIT News Office

Related links

ARCHIVE: Programming drones to fly in the face of uncertainty

http://news.mit.edu/2018/mit-csail-programming-drones-fly-face-uncertainty-0212

ARCHIVE: Robots collaborate to deliver meds, supplies, and even drinks

http://news.mit.edu/2015/csail-delivery-robots-collaborate-0810

ARCHIVE: Delivery by drone

http://news.mit.edu/2014/algorithm-monitors-delivery-drones-0821

ARCHIVE: Robots in reality

http://news.mit.edu/2011/profile-roy-1128


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.