News Release

CornPheno: A game-changer in corn breeding with smartphone-based phenotyping

Peer-Reviewed Publication

Nanjing Agricultural University The Academy of Science

Figure 4. Overall architecture of PET for corn ear kernel localization.

First, a Convolutional Neural Network (CNN) backbone extracts the image features. Then, a transformer encoder with progressive rectangle window attention is applied to these features to encode contextual information. Next, a quadtree splitter takes sparse query points and the encoded features as input and outputs a point-query quadtree. After that, a transformer decoder decodes these point queries in parallel, with attention computed within a local window. Finally, the point queries are passed through a prediction head to obtain kernel predictions, i.e., a “kernel” or “no kernel” label together with its probability and location (a simplified code sketch of this pipeline follows below).

Credit: The authors
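
For readers who want a concrete picture of the five stages in the figure legend, the sketch below mirrors them in simplified PyTorch. It is an illustrative toy, not the authors' PET implementation: the two-layer backbone, the fixed 8x8 window attention (standing in for progressive rectangle window attention), the activation-threshold quadtree rule, and all module sizes are assumptions made to keep the example short and runnable.

# A minimal, illustrative PET-style pipeline; not the authors' code.
import torch
import torch.nn as nn


class TinyBackbone(nn.Module):
    """Stand-in CNN backbone: two strided convs that downsample the image 4x."""

    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)  # (B, dim, H/4, W/4)


class WindowEncoder(nn.Module):
    """Transformer encoder applied inside fixed non-overlapping windows,
    a simplified stand-in for progressive rectangle window attention."""

    def __init__(self, dim=64, win=8):
        super().__init__()
        self.win = win
        layer = nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
        self.enc = nn.TransformerEncoder(layer, num_layers=1)

    def forward(self, feat):
        B, C, H, W = feat.shape
        w = self.win
        # partition the feature map into w x w windows
        t = feat.view(B, C, H // w, w, W // w, w)
        t = t.permute(0, 2, 4, 3, 5, 1).reshape(-1, w * w, C)
        t = self.enc(t)  # attention is computed within each local window
        t = t.view(B, H // w, W // w, w, w, C)
        return t.permute(0, 5, 1, 3, 2, 4).reshape(B, C, H, W)


def quadtree_queries(feat, thresh=1.0, cell=8):
    """Toy quadtree splitter: one query per sparse cell, four finer child
    queries wherever the mean feature activation suggests a dense region."""
    C, H, W = feat.shape
    act = feat.abs().mean(0)  # crude per-pixel 'crowdedness' proxy
    pts = []
    for i in range(0, H, cell):
        for j in range(0, W, cell):
            if act[i:i + cell, j:j + cell].mean() > thresh:
                h = cell // 2  # dense cell: split into 4 child queries
                pts += [(i + di + h // 2, j + dj + h // 2)
                        for di in (0, h) for dj in (0, h)]
            else:              # sparse cell: keep a single query
                pts.append((i + cell // 2, j + cell // 2))
    return torch.tensor(pts)   # (N, 2) integer (row, col) query points


class PointHead(nn.Module):
    """Decodes each point query into a kernel/no-kernel probability and a
    sub-pixel offset, standing in for PET's decoder plus prediction head."""

    def __init__(self, dim=64):
        super().__init__()
        self.cls = nn.Linear(dim, 2)  # logits for 'no kernel' vs 'kernel'
        self.loc = nn.Linear(dim, 2)  # (dy, dx) refinement of each query

    def forward(self, feat, pts):
        f = feat[:, pts[:, 0], pts[:, 1]].T  # (N, C) features at the queries
        return self.cls(f).softmax(-1), self.loc(f)


if __name__ == "__main__":
    img = torch.randn(1, 3, 128, 128)            # dummy ear image
    feat = WindowEncoder()(TinyBackbone()(img))  # stages 1-2
    pts = quadtree_queries(feat[0])              # stage 3
    probs, offsets = PointHead()(feat[0], pts)   # stages 4-5
    kept = probs[:, 1] > 0.5                     # confident 'kernel' queries
    print(f"{int(kept.sum())} kernels predicted from {len(pts)} queries")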

This user-friendly and cost-effective solution provides a practical way for corn breeders to measure key ear traits—such as kernels per ear, rows per ear, and kernels per row—directly in the field. By leveraging advanced AI technologies, CornPheno overcomes the challenges of traditional phenotyping methods, which are often labor-intensive, error-prone, and require expensive equipment. The smartphone-based platform enables breeders to instantly capture and analyze corn ear traits in complex, outdoor environments, making it an invaluable tool for accelerating the breeding process and improving efficiency in agricultural research.

Corn is one of the world's most important crops, critical for food, feed, and industrial applications. In 2023, corn accounted for 41% of China's total crop production, underscoring its global significance. In breeding, phenotyping key ear traits such as kernels per ear and rows per ear is crucial for selecting superior varieties. Traditional phenotyping methods, however, are labor-intensive and error-prone, and they depend on costly, fixed image-capture environments. Recent advances in AI and computer vision have sought to automate and streamline this process, but many solutions remain expensive and tied to specific environmental requirements. CornPheno addresses these gaps by offering a portable, cost-effective way for breeders to collect data directly in the field.

A study (DOI: 10.1016/j.plaphe.2025.100129) published in Plant Phenomics on 15 October 2025 by Hao Lu’s team at Huazhong University of Science and Technology presents CornPheno, which enables accurate measurement of key corn ear traits, including kernels per ear, rows per ear, and kernels per row, using just a smartphone, significantly enhancing field productivity and breeding decision-making.

To validate the effectiveness of CornPheno, the research team compared it with five well-established plant counting models, BCNet, CSRNet, P2PNet, TasselNetV2, and CCTrans, using Mean Absolute Error (MAE), Mean Squared Error (MSE), and the coefficient of determination (R²) as evaluation metrics (see the sketch below for how these metrics are conventionally computed).

CornPheno integrates two core technologies: the CornPET model for kernel detection and the Unicorn model for row detection. CornPET, based on crowd localization techniques, accurately localizes kernels on dense corn ears even in complex field environments, while Unicorn handles challenges such as row distortion, missing kernels, and misalignment.

The system was tested in controlled machine, indoor, and field environments, and CornPheno outperformed the other models, achieving an R² of 0.7641 for kernels per ear, significantly higher than the next best model, P2PNet (R² = 0.6884). In field conditions it maintained robust performance with an R² of 0.8190, only slightly below the 0.8322 achieved in the controlled setting. For rows per ear, CornPheno detected the correct row count 80.3% of the time, with an MAE of 0.234, demonstrating its reliability. For kernels per row, it achieved an F1-score of 0.9418, with high precision (0.9796) and recall (0.9091), further confirming its robustness.

The system is integrated into OpenPheno, a WeChat-based mini-program, allowing breeders to perform in-field phenotyping with nothing more than a smartphone. The team also developed the CKC-Wild dataset, comprising 1,727 images of corn ears captured in various field environments and annotated with precise kernel locations. CornPheno’s performance on this dataset demonstrates its ability to count kernels and detect rows accurately under diverse conditions, making it a practical and efficient tool for large-scale corn breeding.
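
As a quick reference for the metrics quoted above, the snippet below shows how MAE, MSE, R², precision, recall, and F1 are conventionally computed from per-ear counts and matched detections. The numbers fed into it are invented placeholders for illustration, not data from the study.

# Conventional definitions of the reported metrics; all inputs below are
# invented placeholder numbers, not the study's data.
import numpy as np


def count_metrics(y_true, y_pred):
    """MAE, MSE, and R^2 between true and predicted kernel counts per ear."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.abs(err).mean()
    mse = (err ** 2).mean()
    r2 = 1.0 - (err ** 2).sum() / ((y_true - y_true.mean()) ** 2).sum()
    return mae, mse, r2


def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 from matched (tp) and unmatched detections."""
    p = tp / (tp + fp)
    r = tp / (tp + fn)
    return p, r, 2 * p * r / (p + r)


# Five ears' kernel counts, ground truth vs. prediction (placeholders).
mae, mse, r2 = count_metrics([412, 388, 450, 371, 405],
                             [398, 395, 441, 380, 416])
print(f"MAE={mae:.2f}  MSE={mse:.2f}  R2={r2:.4f}")
p, r, f1 = precision_recall_f1(tp=450, fp=9, fn=45)
print(f"precision={p:.4f}  recall={r:.4f}  F1={f1:.4f}")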

CornPheno is a pioneering step towards revolutionizing corn phenotyping with an affordable, smartphone-based solution. It enables quick and accurate trait analysis directly in the field, providing breeders with essential data to make informed decisions. By overcoming the limitations of traditional phenotyping methods, CornPheno accelerates breeding cycles, reduces costs, and contributes to the global effort to improve crop yields and resilience.

###

References

DOI

10.1016/j.plaphe.2025.100129

Original Source URL

https://doi.org/10.1016/j.plaphe.2025.100129

Funding information

This work is jointly supported by the HUST Undergraduate Natural Science Foundation under Grant No. 62500034 and the PhenoTrait Foundation.

About Plant Phenomics

Plant Phenomics is dedicated to publishing novel research that advances all aspects of plant phenotyping, from the cell to the plant population level, using innovative combinations of sensor systems and data analytics. Plant Phenomics also aims to connect phenomics to other science domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer science. Plant Phenomics should thus contribute to advancing plant sciences and agriculture, forestry, and horticulture by addressing key scientific challenges in the area of plant phenomics.

