News Release

Integrating light and structure: Smarter mapping for fragile wetland ecosystems

Peer-Reviewed Publication

Journal of Remote Sensing

Image: Overview of the study area. (A) UAV-LiDAR point cloud. (B) UAV hyperspectral images. (C to F) Field measurements.

Credit: Journal of Remote Sensing

Accurate classification of wetland vegetation is essential for biodiversity conservation and carbon-cycle monitoring. This study developed an adaptive ensemble learning stacking (AEL-Stacking) framework that combines hyperspectral and light detection and ranging (LiDAR) data captured by unmanned aerial vehicles (UAVs) to precisely identify vegetation species in karst wetlands. The approach achieved up to 92.77% overall accuracy, substantially outperforming traditional models, and revealed how spectral and structural features jointly improve ecosystem mapping and restoration strategies.

Karst wetlands are globally significant ecosystems that regulate water, store carbon, and harbor rich biodiversity. However, their intricate vegetation composition and the similar canopy spectra of many species hinder accurate remote sensing classification. Traditional field surveys are costly and spatially limited, and multispectral imaging lacks the spectral resolution needed for species-level mapping. LiDAR provides 3D structural data but struggles with water-surface reflectance and weak return signals. These challenges make it necessary to integrate complementary optical and structural data for precise classification of vegetation species in karst wetlands.

Researchers from Guilin University of Technology and collaborators published their findings (DOI: 10.34133/remotesensing.0452) in the Journal of Remote Sensing on October 16, 2025. The study introduces a UAV-based approach that merges hyperspectral imagery (HSI) and LiDAR point-cloud data through the AEL-Stacking model. The framework not only enhances classification accuracy but also uses local interpretable model-agnostic explanations (LIME) to visualize how each feature contributes to the model's decisions, offering both high precision and interpretability in mapping complex wetland vegetation structures.

The study demonstrated that combining HSI and LiDAR data achieved the highest overall accuracy (87.91%–92.77%), surpassing single-source approaches by up to 9.5%. The AEL-Stacking model, which integrates Random Forest, LightGBM, and CatBoost classifiers, outperformed both conventional ensemble algorithms and a deep-learning model (Swin Transformer) by 0.96%–7.58%. LiDAR features, especially digital surface model (DSM) variables, were pivotal for distinguishing species with distinct vertical structures, while hyperspectral vegetation indices such as NDVI and blue-edge parameters improved recognition of herbaceous species. These results highlight the synergy between optical and structural data in resolving species with overlapping spectral signatures.

Field surveys were conducted in the Huixian Karst Wetland of Guilin, China, one of the country's largest karst wetlands. UAV flights equipped with Headwall Nano-Hyperspec and DJI Zenmuse L1 LiDAR sensors collected over 4,500 hyperspectral images and dense point clouds (208 points/m²). The integrated dataset covered 13 vegetation types, including lotus, miscanthus, and camphor trees. Through recursive feature elimination and correlation analysis, 40 optimal features were selected from more than 600 variables. The AEL-Stacking framework adaptively tuned hyperparameters, selected the best-performing base learner as the meta-model, and validated results using 10-fold cross-validation. LIME analysis identified the DSM and blue spectral bands as the most influential features, and lotus and miscanthus achieved classification F1-scores above 0.9. The model markedly reduced misclassification between morphologically similar species, yielding detailed vegetation maps critical for ecosystem monitoring.
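To illustrate the feature-screening step described above, the following is a minimal sketch of correlation filtering followed by recursive feature elimination, assuming scikit-learn and NumPy. The data, correlation threshold, and elimination step size are illustrative placeholders, not the study's actual variables or code.

```python
# Hedged sketch: correlation screening + recursive feature elimination (RFE).
# All data and thresholds here are synthetic stand-ins, not the paper's.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 600))    # stand-in for 600+ fused HSI/LiDAR features
y = rng.integers(0, 13, size=500)  # stand-in labels for 13 vegetation types

# Correlation analysis: greedily drop any feature that is highly correlated
# (|r| > 0.95) with an earlier-indexed feature.
corr = np.abs(np.corrcoef(X, rowvar=False))
upper = np.triu(corr, k=1)
keep = [j for j in range(X.shape[1]) if not (upper[:j, j] > 0.95).any()]
X_screened = X[:, keep]

# Recursive feature elimination: repeatedly fit a Random Forest and drop
# the least important features until 40 remain.
selector = RFE(
    estimator=RandomForestClassifier(n_estimators=100, random_state=0),
    n_features_to_select=40,
    step=20,  # features removed per iteration
)
selector.fit(X_screened, y)
selected = np.flatnonzero(selector.support_)
print(f"{len(selected)} features kept, e.g. indices {selected[:8]}")
```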

"Our approach bridges the gap between spectral and structural sensing," said Dr. Bolin Fu, corresponding author. "By combining UAV hyperspectral and LiDAR data through adaptive ensemble learning, we achieved both precision and interpretability in vegetation mapping. The framework not only improves species recognition in complex karst environments but also provides a generalizable tool for ecological monitoring and habitat restoration worldwide".

The team built the AEL-Stacking model by combining Random Forest, LightGBM, and CatBoost classifiers within a grid-search-optimized adaptive framework. The model used 70% of the data for training and 30% for testing, supported by 10-fold cross-validation. Hyperspectral features (e.g., NDVI, EVI, CIg) and LiDAR-derived metrics (e.g., DSM, intensity skewness) were fused into a multidimensional dataset. To interpret the results, the LIME algorithm quantified each feature's contribution and visualized how variations in the data influenced species classification across multiple vegetation types.
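As a rough illustration of this workflow, here is a minimal stacking sketch in Python, assuming scikit-learn, lightgbm, catboost, and the lime package. The adaptive step is approximated by promoting the base learner with the best 10-fold cross-validation score to meta-model; the data, hyperparameters, and selection rule are simplified assumptions, not the authors' implementation.

```python
# Hedged sketch of a stacking ensemble in the spirit of AEL-Stacking.
# Data, hyperparameters, and the meta-model rule are illustrative only.
import numpy as np
from sklearn.base import clone
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.model_selection import cross_val_score, train_test_split
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier
from lime.lime_tabular import LimeTabularExplainer

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 40))     # stand-in for the 40 selected features
y = rng.integers(0, 13, size=600)  # stand-in labels for 13 vegetation types

# 70% training / 30% testing, as described in the release.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

base_learners = {
    "rf": RandomForestClassifier(n_estimators=100, random_state=0),
    "lgbm": LGBMClassifier(n_estimators=100, random_state=0, verbose=-1),
    "cat": CatBoostClassifier(iterations=100, random_seed=0, verbose=0),
}

# Score each base learner with 10-fold CV; promote the best to meta-model.
cv_mean = {name: cross_val_score(est, X_tr, y_tr, cv=10).mean()
           for name, est in base_learners.items()}
best = max(cv_mean, key=cv_mean.get)

stack = StackingClassifier(
    estimators=list(base_learners.items()),
    final_estimator=clone(base_learners[best]),
    cv=10,
)
stack.fit(X_tr, y_tr)
print(f"meta-model: {best}, test accuracy: {stack.score(X_te, y_te):.3f}")

# LIME: explain one test-set prediction as local feature contributions.
explainer = LimeTabularExplainer(X_tr, mode="classification")
exp = explainer.explain_instance(X_te[0], stack.predict_proba, num_features=5)
print(exp.as_list())
```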

This integrative framework demonstrates a scalable and explainable approach for high-resolution wetland mapping, potentially applicable to forest, grassland, and coastal ecosystems. Future work will focus on integrating multi-temporal UAV observations and satellite data fusion to monitor seasonal vegetation dynamics and climate-driven changes in wetland health. By enhancing the transparency and accuracy of AI-driven ecological models, this research paves the way for smarter environmental management and supports the global agenda for biodiversity conservation and carbon neutrality.

###

References

DOI

10.34133/remotesensing.0452

Original Source URL

https://spj.science.org/doi/10.34133/remotesensing.0452

Funding information

This study was supported by the National Natural Science Foundation of China (grant number 42371341), the Natural Science Foundation of Guangxi Zhuang Autonomous Region (grant number 2024GXNSFAA010351), the Innovation Project of Guangxi Graduate Education (grant number YCBZ2024179), and the Key Laboratory of Tropical Marine Ecosystem and Bioresource, Ministry of Natural Resources (grant number 2023ZD02).

About Journal of Remote Sensing

The Journal of Remote Sensing, an online-only Open Access journal published in association with AIR-CAS, promotes the theory, science, and technology of remote sensing, as well as interdisciplinary research within earth and information science.


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.