News Release

New AI-powered 3D tool enables fast, label-free phenotyping in rice and wheat

Peer-Reviewed Publication

Nanjing Agricultural University, The Academy of Science

By combining radiance-field reconstruction with SAM2 segmentation, IPENS enables users to obtain precise organ-level geometry from ordinary multi-view images using only a few prompts. In rice and wheat, the system accurately measures voxel volume, leaf surface area, and leaf dimensions, operating non-destructively and at high speed.

Plant phenotyping technologies underpin the development of genotype-phenotype association models and guide trait improvement in modern breeding. Traditional 2D imaging methods struggle to capture complex plant structures, while field phenotyping often requires manual sampling and destructive testing. Recent advances in 3D reconstruction—including Neural Radiance Fields (NeRF) and 3D Gaussian Splatting—have demonstrated strong potential for non-invasive trait evaluation, but most models require large annotated datasets, perform poorly on occluded organs like rice grains, or demand repetitive user interaction per target. Unsupervised approaches lack precision at grain-scale resolution, and multi-target segmentation remains inefficient.

A study (DOI: 10.1016/j.plaphe.2025.100106), published in Plant Phenomics on 15 September 2025 by Youqiang Sun’s team at the Chinese Academy of Sciences, provides researchers and breeders with rapid, reliable phenotypic data to accelerate intelligent breeding and improve crop productivity.

To evaluate the performance of IPENS, the researchers first designed a quantitative segmentation experiment using the MMR (rice) and MMW (wheat) datasets, with 30% of the data serving as a validation set and the remainder used to train the comparison algorithms. Segmentation was performed by manually placing two positive and two negative prompts on the first and last video frames, allowing the model to carry out unsupervised 3D instance segmentation guided by the prompts. Segmentation quality was assessed using IoU, precision, recall, and F1 score, and results were compared with existing mainstream algorithms: the unsupervised CrossPoint, the supervised interactive Agile3D, and the fully supervised state-of-the-art OneFormer3D. A time-performance evaluation also measured segmentation time in single- and multi-target scenarios, benchmarking IPENS against SA3D and analyzing how efficiency scales with the number of targets. Beyond segmentation, phenotypic accuracy was verified through voxel volume estimation and leaf-trait measurement, examining how multi-stage point cloud processing (convex hull → mesh → mesh subdivision) influences error and model stability.

Results show that IPENS achieved IoU scores of 61.48%, 69.54%, and 60.13% for rice grain, leaf, and stem, and 92.82%, 86.47%, and 89.76% for wheat panicle, leaf, and stem, respectively. Its mean IoU surpassed that of the unsupervised CrossPoint (rice 23.41% / wheat 16.50%) and exceeded Agile3D’s first-interaction performance, demonstrating competitive accuracy without labeled data. Time analysis revealed a ~3.3× speedup over SA3D, with single-organ segmentation taking ~70 seconds and multi-organ inference scaling linearly with the number of targets. Trait estimation further confirmed the model’s reliability: rice grain voxel volume reached R² = 0.7697 (RMSE 0.0025) and wheat panicle voxel volume R² = 0.9956.
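The segmentation metrics reported above (IoU, precision, recall, F1) can be illustrated with a minimal sketch. The toy boolean masks below are illustrative placeholders, not data from the study:

```python
# Minimal sketch: computing IoU, precision, recall, and F1 score
# from boolean segmentation masks. Masks here are toy examples.
import numpy as np

def segmentation_metrics(pred: np.ndarray, gt: np.ndarray):
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()        # true positives
    fp = np.logical_and(pred, ~gt).sum()       # false positives
    fn = np.logical_and(~pred, gt).sum()       # false negatives
    iou = tp / (tp + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return iou, precision, recall, f1

pred = np.array([1, 1, 0, 1, 0], dtype=bool)   # predicted organ mask
gt   = np.array([1, 0, 0, 1, 1], dtype=bool)   # ground-truth mask
iou, p, r, f1 = segmentation_metrics(pred, gt)
print(f"IoU={iou:.2f} P={p:.2f} R={r:.2f} F1={f1:.2f}")
# → IoU=0.50 P=0.67 R=0.67 F1=0.67
```

In 3D instance segmentation these masks would cover points or voxels rather than pixels, but the formulas are identical.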
Leaf area accuracy improved progressively with mesh subdivision (rice R² = 0.84; wheat R² = 1.00), and leaf length/width estimation maintained millimeter-level errors (rice R² = 0.97/0.87; wheat R² = 0.99/0.92), confirming that higher segmentation quality directly supports stable phenotypic prediction.
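The first stage of the point cloud processing pipeline described above (convex hull → mesh) can be sketched with standard tools. This is a simplified illustration, not the paper's implementation; the random point cloud stands in for a segmented organ:

```python
# Sketch: estimating volume and surface area of a segmented 3D point
# cloud via its convex hull, mirroring the first stage of the
# convex hull -> mesh pipeline. The point cloud is a random stand-in.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(42)
points = rng.normal(size=(500, 3))   # placeholder for an organ's 3D points

hull = ConvexHull(points)
print(f"convex-hull volume:       {hull.volume:.3f}")
print(f"convex-hull surface area: {hull.area:.3f}")
```

A convex hull overestimates the volume of concave organs such as leaves, which is presumably why the pipeline refines the hull into a mesh and subdivides it before measuring area.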

IPENS offers a scalable, non-invasive and label-free tool for field and greenhouse phenotyping. By rapidly generating accurate 3D trait data, it provides breeding programs with efficient support for yield-related evaluations, genomic selection, and organ-level trait screening. The method improves throughput for grain counting, biomass measurement, and plant architecture assessment while reducing reliance on expert annotators. With strong cross-species generalization demonstrated in rice, wheat, and other crops, IPENS has potential for integration into automated phenotyping chambers, robotic imaging platforms, and future smart-agriculture pipelines. Its capacity to link phenotype data to genomic models may significantly accelerate trait improvement and breeding decision-making.

###

References

DOI

10.1016/j.plaphe.2025.100106

Original Source URL

https://doi.org/10.1016/j.plaphe.2025.100106

Funding information

This research was supported by the National Key Research and Development Program of China (Grant Number 2023YFD1901003) and the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant XDA28120402).

About Plant Phenomics

Plant Phenomics is dedicated to publishing novel research that advances all aspects of plant phenotyping, from the cell to the plant population level, using innovative combinations of sensor systems and data analytics. The journal also aims to connect phenomics to other science domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer science. Plant Phenomics thus contributes to advancing plant sciences and agriculture, forestry, and horticulture by addressing key scientific challenges in plant phenomics.


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.