News Release

DeepForest AI unlocks hidden forest layers

Peer-Reviewed Publication

Journal of Remote Sensing

Forests hide critical ecological information beneath dense canopy layers that traditional imaging techniques struggle to access. This research introduces a breakthrough approach that enables volumetric sensing of deep vegetation using standard aerial cameras combined with synthetic-aperture imaging and 3D neural networks. The method retrieves reflectance and vegetation-health indicators across entire forest volumes, offering a low-cost alternative to light detection and ranging (LiDAR) or radar for monitoring biomass, ecosystem health, and environmental change.

Monitoring forest health typically relies on remote sensing tools such as LiDAR, radar, and multispectral photography. While radar and LiDAR penetrate canopies to reveal structure, they struggle to provide the fine-resolution spectral detail needed for vegetation-health assessment. Cameras offer high spatial resolution and multispectral flexibility, yet conventional imaging captures only top-layer reflectance, leaving under-canopy biomass invisible. This limitation restricts carbon estimation, biodiversity monitoring, early disturbance detection, and climate-impact assessment. New strategies are therefore needed to sense deep vegetation layers and reconstruct spectral information throughout entire forest volumes.

Researchers from Johannes Kepler University Linz, the Helmholtz Centre for Environmental Research, and Leipzig University have developed a novel imaging technology named DeepForest, published on 11 November 2025 in the Journal of Remote Sensing (DOI: 10.34133/remotesensing.0907). The work demonstrates how drones equipped with regular cameras, rather than expensive LiDAR or radar systems, can capture under-canopy vegetation using synthetic-aperture focal stacks enhanced by 3D convolutional neural networks. This addresses a long-standing problem in forest sensing: seeing beyond the top canopy while maintaining the high spectral detail needed for ecological monitoring.

DeepForest reconstructs volumetric reflectance stacks of vegetation, revealing forest structure from canopy to understory. The approach improves deep-layer reflectance accuracy by 2–12 times, with an average correction of roughly seven-fold, even in forests with densities of up to 1,680 trees/ha. Unlike photogrammetry, which reconstructs only the top layers, DeepForest recovers spectral cues across vertical forest segments and supports the calculation of vegetation indices such as the normalized difference vegetation index (NDVI). Field tests achieved a mean squared error (MSE) of 0.05 between real drone imagery and the reconstructed upper layers, indicating strong reliability. The technology’s compatibility with standard multispectral cameras significantly reduces cost, enabling broad deployment for environmental monitoring, forestry, carbon accounting, and conservation.

The team scanned 30 × 30 m forest plots using a drone-mounted multispectral camera (green, red, red-edge, and near-infrared (NIR) bands) from 35 m altitude, sampling a 24 × 24 m synthetic-aperture grid of 9 × 9 images. These images were computationally refocused into 440 focal slices, forming a 3D stack analogous to a microscope focal series. However, out-of-focus scattering from overlapping leaves produced noisy signals, which were resolved using depth-specific 3D CNNs trained on simulated procedural-forest datasets. Over 11 million training samples and 4.6 million validation samples were used to teach the networks how to remove occlusion noise. The corrected reflectance stacks were further calibrated with field-reconstructed canopy points, enabling accurate NDVI estimation. Volumetric NDVI visualization revealed subtle structural differences and biomass distribution, with 3D crop filters distinguishing healthy vegetation zones. Tests confirmed robust performance across varying tree densities, indicating scalability to real-world forest systems.
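To make the refocusing step concrete, below is a minimal sketch of the shift-and-average ("integral imaging") refocusing that synthetic-aperture focal stacking builds on. It is not the authors' code: the function names, the simplified pinhole and nadir-view geometry, and the numeric parameters (other than the 9 × 9 grid, the 35 m altitude, and the 440-slice count quoted above) are illustrative assumptions.

```python
# Illustrative sketch only (not the published pipeline). Assumes a regular
# grid of nadir-looking images from a fixed altitude and a pinhole camera.
import numpy as np


def refocus_slice(images, positions, depth, ref_depth, focal_px):
    """Average grid images after a parallax shift so that a plane at `depth`
    metres below the cameras comes into focus (shift-and-add refocusing).

    images    : (N, H, W) single-band reflectance images
    positions : (N, 2) camera x/y offsets from the grid centre, in metres
    depth     : distance to the plane being focused, in metres
    ref_depth : reference distance the images are registered to, in metres
    focal_px  : focal length expressed in pixels (hypothetical value below)
    """
    acc = np.zeros(images[0].shape, dtype=np.float64)
    for img, (dx, dy) in zip(images, positions):
        # Parallax of a point at `depth` relative to the reference plane
        # scales with baseline / depth.
        shift_x = focal_px * dx * (1.0 / depth - 1.0 / ref_depth)
        shift_y = focal_px * dy * (1.0 / depth - 1.0 / ref_depth)
        # Integer shifts keep the example dependency-free; subpixel
        # interpolation would be used in practice.
        acc += np.roll(img, (int(round(shift_y)), int(round(shift_x))),
                       axis=(0, 1))
    return acc / len(images)


def focal_stack(images, positions, depths, ref_depth, focal_px):
    """Stack one refocused slice per depth plane -> (D, H, W) volume."""
    return np.stack([refocus_slice(images, positions, d, ref_depth, focal_px)
                     for d in depths])


# Toy usage: a 9 x 9 grid of synthetic 128 x 128 images over a 24 x 24 m
# aperture, refocused onto 40 planes (the study used 440 slices).
rng = np.random.default_rng(0)
imgs = rng.random((81, 128, 128))
grid = np.stack(np.meshgrid(np.linspace(-12, 12, 9),
                            np.linspace(-12, 12, 9)), -1).reshape(-1, 2)
stack = focal_stack(imgs, grid, depths=np.linspace(5.0, 35.0, 40),
                    ref_depth=35.0, focal_px=800.0)
print(stack.shape)  # (40, 128, 128)
```

In the published pipeline, each refocused slice would then be passed to its depth-specific 3D CNN to suppress occlusion noise; that stage is omitted from this sketch.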

“DeepForest opens a new window into forest interiors,” the authors noted. “By enabling standard drones to sense beyond canopy surfaces, we offer ecologists access to previously unreachable data on biomass, biodiversity, and stress conditions. This could transform monitoring of climate change, fire recovery, and carbon dynamics at scale.”

Multispectral aerial images were collected using a DJI drone equipped with a Sequoia+ camera. The data were radiometrically corrected, downsampled, and processed with synthetic-aperture focal stacking. Each depth slice was fed into an individual 3D convolutional neural network trained to suppress out-of-focus occlusion artifacts. Simulations used Gazebo-based procedural forests to generate ground-truth reflectance for training. NDVI volumes were computed from the corrected red and NIR channels for vegetation-health analysis (NDVI = (NIR − Red) / (NIR + Red)).
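As an illustration of the vegetation-index step, here is a minimal sketch of per-voxel NDVI computed over corrected red and NIR focal stacks using the relation NDVI = (NIR − Red) / (NIR + Red) quoted above. The array shapes, the epsilon guard against division by zero, and the 0.6 mask threshold are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch of a volumetric NDVI computation (not the authors' code).
import numpy as np


def ndvi_volume(nir_stack, red_stack, eps=1e-6):
    """Per-voxel NDVI = (NIR - Red) / (NIR + Red) over (depth, rows, cols)
    reflectance stacks; eps avoids division by zero in empty voxels."""
    nir = nir_stack.astype(np.float64)
    red = red_stack.astype(np.float64)
    return (nir - red) / (nir + red + eps)


def healthy_vegetation_mask(ndvi, threshold=0.6):
    """Crude 3D filter keeping voxels above an NDVI cut-off; the release
    mentions 3D crop filters, but this threshold is a placeholder value."""
    return ndvi > threshold


# Toy usage on random stacks shaped (slices, rows, cols).
rng = np.random.default_rng(1)
nir = rng.random((40, 64, 64))
red = rng.random((40, 64, 64))
ndvi = ndvi_volume(nir, red)
print(ndvi.shape, healthy_vegetation_mask(ndvi).mean())
```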

DeepForest may enable large-scale forest digital twins, 3D biomass estimation, and climate-impact tracking with low-cost hardware. Future work will expand the model to more forest types, integrate LiDAR for void-point filtering, and improve under-canopy resolution for fine-scale branch/leaf mapping. Scaling through drone swarms or fast fixed-wing platforms could support carbon-offset verification, wildfire-risk surveillance, and tropical biodiversity monitoring. The technology positions camera-based sensing as a powerful alternative to current remote-sensing tools—opening the canopy to science.

###

References

DOI

10.34133/remotesensing.0907

Original Source URL

https://doi.org/10.34133/remotesensing.0907

Funding information

This study received funding from the Linz Institute of Technology (grant LIT-2022-11-SEE-112; O.B.) and from the Austrian Science Fund (FWF) and the German Research Foundation (DFG) (grant I 6046-N; O.B.).

About Journal of Remote Sensing

The Journal of Remote Sensing, an online-only Open Access journal published in association with AIR-CAS, promotes the theory, science, and technology of remote sensing, as well as interdisciplinary research within earth and information science.


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.