News Release

SwRI develops off-road autonomous driving tools focused on camera vision

Researchers develop powerful vision-based autonomy alternative to lidar, radar, GPS

Business Announcement

Southwest Research Institute



SwRI is exploring using stereo cameras, or stereovision, as an alternative to lidar sensors in automated vehicles. SwRI’s stereovision algorithms create disparity maps that estimate the depth of roadway features and obstacles. The left image shows how a conventional camera sees an off-road trail. The middle image shows a lidar image of the same trail. The right image shows a stereovision disparity map based on SwRI’s algorithms, where colors indicate the distance of detected objects (yellow is near and blue is far). The gray/white in the lidar image suggests the outline of trees and a vehicle hood, but it does not indicate depth or distance.


Credit: Southwest Research Institute

SAN ANTONIO — March 12, 2024 — Southwest Research Institute has developed off-road autonomous driving tools with a focus on stealth for the military and agility for space and agriculture clients. The vision-based system pairs stereo cameras with novel algorithms, eliminating the need for lidar and active sensors.

“We reflected on the toughest machine vision challenges and then focused on achieving dense, robust modeling for off-road navigation,” said Abe Garza, a research engineer in SwRI’s Intelligent Systems Division.

Through internal research, SwRI engineers developed a suite of tools known as the Vision for Off-road Autonomy (VORA). The passive system can perceive objects, model environments and simultaneously localize and map while navigating off-road environments.

The VORA team envisioned a camera system as a passive sensing alternative to lidar, a light detection and ranging sensor that emits active lasers to probe objects and calculate depth and distance. Though highly reliable, lidar sensors produce light that can be detected by hostile forces. Radar, which emits radio waves, is also detectable. GPS navigation can be jammed, and its signals are often blocked in canyons and mountains, which can limit agricultural automation.

“For our defense clients, we wanted to develop better passive sensing capabilities but discovered that these new computer vision tools could benefit agriculture and space research,” said Meera Towler, an SwRI assistant program manager who led the project.

The researchers also designed the VORA technology with planetary surface exploration in mind. In space applications, autonomous robots are limited by power, payload capacity and intermittent connectivity, making cameras a more practical choice than power-hungry lidar systems.

To overcome these constraints, the team developed new software that uses stereo camera data for high-precision tasks traditionally accomplished with lidar: localization, perception, mapping and world modeling.

Based on this research, SwRI developed a deep learning stereo matcher (DLSM) tool, which uses a recurrent neural network to create dense, accurate disparity maps from stereovision. A disparity map encodes the pixel offset between corresponding points in the two stereo images; because nearby objects shift more between the two views than distant ones, that offset can be converted into depth.
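SwRI's DLSM is a neural network, but the underlying principle it refines can be illustrated with classic block matching: for each pixel in the left image, slide a small patch across the right image and record the horizontal shift (disparity) that best matches. The sketch below is illustrative only and is not SwRI's code; the function name and parameters are our own.

```python
import numpy as np

def block_match_disparity(left, right, max_disp=8, block=3):
    """Estimate per-pixel disparity between rectified stereo images
    by sliding a block from the left image across the right image
    and keeping the shift with the lowest sum of absolute differences."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Synthetic demo: the right image is the left image shifted by 2 pixels,
# simulating a scene at one uniform depth.
rng = np.random.default_rng(0)
left = rng.random((20, 30))
right = np.roll(left, -2, axis=1)
disp = block_match_disparity(left, right, max_disp=5)
```

Given the camera focal length f and stereo baseline B, depth follows as Z = f·B / disparity, which is how a disparity map like the one in the image above (yellow near, blue far) encodes distance.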

To aid in simultaneous localization and mapping, SwRI developed a factor graph algorithm to intelligently combine sparse data from stereo image features, landmarks, inertial measurement unit (IMU) readings and wheel encoders to produce highly accurate localization data. Factor graphs are probabilistic graphical models that relate unknown variables, such as vehicle poses, to the measurements that constrain them; autonomous systems use them to infer the most likely state given all measurements at once.
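A toy version of this idea can be shown in one dimension: each measurement (a prior, wheel-encoder odometry, or a landmark range fix) becomes one row of a linear system, and solving the system in a least-squares sense fuses all of them into a single pose estimate. This is a minimal sketch of the factor-graph concept, not SwRI's implementation; all values are made up for illustration.

```python
import numpy as np

# Unknowns: robot positions x0, x1, x2 along a 1-D track.
# Each row of A is one "factor" constraining those unknowns.
A = np.array([
    [ 1.0,  0.0,  0.0],   # prior: x0 is near the origin
    [-1.0,  1.0,  0.0],   # odometry factor: x1 - x0 (wheel encoders)
    [ 0.0, -1.0,  1.0],   # odometry factor: x2 - x1 (wheel encoders)
    [ 0.0,  1.0,  0.0],   # landmark range factor fixing x1
    [ 0.0,  0.0,  1.0],   # landmark range factor fixing x2
])
b = np.array([0.0, 0.9, 1.1, 0.9, 2.1])  # noisy measurements

# Least squares finds the poses that best explain every factor at once,
# reconciling the slightly inconsistent odometry and landmark readings.
poses, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Real SLAM factor graphs are nonlinear and far larger, with camera, IMU and encoder factors added as the vehicle moves, but the fusion principle is the same: every sensor contributes a constraint, and the solver balances them all.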

“We apply our autonomy research to military and commercial vehicles, agriculture applications and so much more,” Towler said. “We are excited to show our clients a plug-and-play stereo camera solution integrated into an industry-leading autonomy stack.”

SwRI plans to integrate VORA technology into other autonomy systems and test it on an off-road course at SwRI’s San Antonio campus.

SwRI has made safety and security a priority in the development of autonomous vehicles and automated driving systems as the technology reaches advanced levels of readiness for commercial and governmental use.

To learn more about this project, visit
