PhenoRob-F, an autonomous ground-based robot equipped with RGB, hyperspectral, and depth sensors, can navigate crop fields on its own, capturing and analyzing data with exceptional accuracy. The robot achieved impressive results in detecting wheat ears, segmenting rice panicles, reconstructing 3D plant structures, and classifying drought severity in rice with up to 99.6% accuracy.
To meet the global challenge of increasing food production under climate change, plant breeders require reliable phenotypic data linking genes to observable traits such as growth, yield, and stress tolerance. Traditional manual measurements are labor-intensive and prone to error, while controlled-environment phenotyping systems fail to capture field variability. Aerial systems such as drones offer speed but lack payload and resolution, and fixed gantry systems are expensive and immobile. Autonomous mobile robots bridge these gaps with their flexible mobility, high-resolution imaging, and minimal soil disturbance. However, existing robots have struggled to balance precision, stability, and scalability under field conditions. To address these challenges, researchers designed PhenoRob-F to deliver robust, high-throughput phenotyping across multiple crops and environments.
A study (DOI: 10.1016/j.plaphe.2025.100085) published in Plant Phenomics on 13 August 2025 by Peng Song’s team at Huazhong Agricultural University presents the robot as a powerful tool for plant breeders and agricultural researchers, enabling high-throughput, precise, and automated data acquisition that accelerates genetic discovery and crop improvement under real-world field conditions.
To evaluate the performance of PhenoRob-F under real-world conditions, the research team conducted three field experiments using multiple sensing and modeling techniques. The first experiment focused on RGB image acquisition for wheat and rice during the heading stage, where top-view canopy images were captured and analyzed using the YOLOv8m and SegFormer_B0 deep learning models. These enabled accurate detection of wheat ears and segmentation of rice panicles for yield estimation. The robot achieved a precision of 0.783, a recall of 0.822, and a mean average precision (mAP) of 0.853 for wheat, while rice panicle segmentation reached a mean intersection over union (mIoU) of 0.949 and an accuracy of 0.987, demonstrating robust visual performance.

The second experiment employed an RGB-D depth camera to reconstruct the 3D structures of maize and rapeseed plants across growth stages. Using the scale-invariant feature transform (SIFT) and iterative closest point (ICP) algorithms, the robot generated high-fidelity point clouds for estimating plant height, achieving strong correlations with manual measurements (R² = 0.99 for maize and 0.97 for rapeseed).

The third experiment applied hyperspectral imaging to rice under drought stress, collecting spectral data in the 900–1700 nm range to classify drought severity. After feature extraction and reduction via the CARS algorithm, a random forest model achieved classification accuracies ranging from 97.7% to 99.6% across five drought levels.

Operationally, PhenoRob-F demonstrated high efficiency, completing phenotyping rounds in 2–2.5 hours and processing up to 1875 potted plants per hour. These experiments collectively confirmed the robot’s capability to autonomously collect multimodal data, integrate spectral and 3D imaging, and deliver high-precision phenotypic trait analysis across diverse crop species.
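The plant-height experiment ultimately reduces a registered point cloud to a simple geometric measurement: the vertical span between the canopy top and the ground plane. The sketch below illustrates that final step on a synthetic cloud; the SIFT/ICP registration that produces the real clouds is omitted, and the percentile bounds and the `plant_height` helper are illustrative choices, not the paper’s implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a registered point cloud (x, y, z in meters):
# a flat ground patch plus points along a plant roughly 1.2 m tall.
ground_xy = rng.uniform(-0.5, 0.5, (2000, 2))
ground_z = rng.normal(0.0, 0.005, 2000)       # slight sensor noise
plant_xy = rng.uniform(-0.1, 0.1, (500, 2))
plant_z = rng.uniform(0.0, 1.2, 500)

cloud = np.vstack([
    np.column_stack([ground_xy, ground_z]),
    np.column_stack([plant_xy, plant_z]),
])

def plant_height(points, lo=1.0, hi=99.5):
    """Robust height estimate: high z-percentile minus low z-percentile."""
    z = points[:, 2]
    return float(np.percentile(z, hi) - np.percentile(z, lo))

h = plant_height(cloud)
print(f"estimated height: {h:.2f} m")
```

Using percentiles rather than the raw min/max makes the estimate robust to stray outlier points from depth-sensor noise, at the cost of slightly underestimating the true canopy top.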
PhenoRob-F offers a practical, cost-effective solution for field-based phenotyping, providing researchers and breeders with an automated means to evaluate crop performance across diverse conditions. The system can assist in yield prediction, stress monitoring, and genetic screening, ultimately supporting the development of climate-resilient and high-yield crop varieties. Beyond breeding, its hyperspectral and 3D imaging capabilities could be extended to monitor soil health, nutrient management, and pest detection. By significantly reducing the labor and time required for data collection, PhenoRob-F accelerates the transition from genomic data to field application—bridging a critical gap in modern agriculture’s digital transformation.
###
References
DOI
10.1016/j.plaphe.2025.100085
Original URL
https://doi.org/10.1016/j.plaphe.2025.100085
Funding information
This work was supported by the National Key Research and Development Program of China (2021YFD1200504, 2022YFD2002304), the National Natural Science Foundation of China (32471992), the Key Core Technology Project in Agriculture of Hubei Province (HBNYHXGG2023-9), and the Supporting Project for High-Quality Development of the Seed Industry of Hubei Province (HBZY2023B001-06).
About Plant Phenomics
Plant Phenomics is dedicated to publishing novel research that will advance all aspects of plant phenotyping from the cell to the plant population levels using innovative combinations of sensor systems and data analytics. Plant Phenomics also aims to connect phenomics to other science domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer sciences. Plant Phenomics should thus contribute to advancing plant sciences and agriculture/forestry/horticulture by addressing key scientific challenges in the area of plant phenomics.
Journal
Plant Phenomics
Method of Research
Experimental study
Subject of Research
Not applicable
Article Title
PhenoRob-F: An autonomous ground-based robot for high-throughput phenotyping of field crops
Article Publication Date
13-Aug-2025
COI Statement
The authors declare that they have no competing interests.