News Release

Smart vision predicts wheat flowering days in advance using AI and weather data

Peer-Reviewed Publication

Nanjing Agricultural University The Academy of Science

By integrating RGB images with meteorological data and applying few-shot learning techniques, the system achieves an F1 score above 0.8 across different planting environments.

Wheat (Triticum aestivum) is a cornerstone of global food security, and predicting its phenological stages, especially anthesis, is critical for optimizing breeding strategies and improving yields. Conventional anthesis prediction models rely on genetic markers or environmental variables such as temperature and photoperiod, successfully estimating flowering dates at the field scale. However, these models fail to capture the micro-environmental variations that influence individual plants. For breeders, timely prediction—typically 8–10 days in advance—is essential for hybrid pollination. Moreover, regulatory agencies in the United States and Australia mandate accurate anthesis reporting 7–14 days before flowering in biotechnology trials. Current manual monitoring is costly, inefficient, and prone to human error. Given these challenges, developing an automated, adaptable, and accurate method for predicting individual plant flowering became imperative.

A study (DOI: 10.1016/j.plaphe.2025.100091), published in Plant Phenomics on 21 July 2025 by Yiting Xie's and Huajian Liu's team at the University of Adelaide, offers a cost-effective, scalable, and precise tool for wheat breeders and regulatory bodies, transforming the traditionally labor-intensive task of tracking flowering into a smart and automated process.

This study developed a multimodal machine vision framework that integrates RGB imagery and on-site meteorological data to predict the anthesis of individual wheat plants. The model reformulates flowering prediction into binary or three-class classification problems, determining whether a plant will flower before, after, or within one day of a critical date. To improve adaptability and minimize data demands, few-shot learning based on metric similarity was introduced, enabling models trained on one dataset to generalize effectively to new environments. The research employed advanced architectures, Swin V2 and ConvNeXt, each paired with fully connected (FC) or transformer (TF) comparators. A multi-step evaluation process—including statistical profiling, cross-dataset validation, few-shot inference, ablation on weather integration, and anchor-transfer tests—demonstrated both model robustness and environmental sensitivity.

Statistical analysis revealed clear climatic impacts on flowering duration, which ranged from 18.4 days under early sowing to 11.6 days under late sowing, with ANOVA (P ≤ 0.001) confirming significant differences across conditions. Cross-dataset validation achieved F1 scores above 0.85 on training datasets and around 0.80 across independent datasets, indicating strong generalization. Few-shot inference improved accuracy further: one-shot models achieved F1 = 0.984 at 8 days before anthesis, while five-shot training raised weaker results (e.g., 0.75 → 0.889). Integrating weather data boosted accuracy by 0.06–0.13 F1 units, particularly 12–16 days before anthesis when image cues were weak.

Anchor-transfer experiments verified model deployability, as Late-derived anchors yielded comparable performance (F1 ≈ 0.76) at new field sites, demonstrating that environmental alignment was more critical than dataset size. Even under the more complex three-class prediction, models retained F1 > 0.6, confirming the framework's robustness and practical potential for high-precision flowering prediction in wheat breeding.
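To illustrate the metric-similarity idea behind the few-shot inference described above, the sketch below classifies a query plant by its cosine similarity to per-class "anchor" (support) embeddings, with weather features concatenated onto the image embedding. This is a minimal, hypothetical illustration: the embedding values, the fusion rule, and the function names are invented for clarity and are not the paper's actual implementation (which uses Swin V2/ConvNeXt backbones with FC or transformer comparators).

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def fuse(image_emb, weather_feats, w=0.5):
    # Hypothetical fusion rule: concatenate the image embedding with
    # down-weighted weather features (the paper's fusion is learned).
    return image_emb + [w * x for x in weather_feats]

def few_shot_predict(query, support):
    """support maps class label -> list of fused support embeddings.
    Pick the class whose support set has the highest mean cosine
    similarity to the query (metric-based few-shot inference)."""
    best_label, best_sim = None, -2.0
    for label, shots in support.items():
        sim = sum(cosine(query, s) for s in shots) / len(shots)
        if sim > best_sim:
            best_label, best_sim = label, sim
    return best_label

# Toy one-shot example with made-up 3-D image "embeddings" and a
# single scalar weather feature per plant:
support = {
    "pre-anthesis":  [fuse([0.9, 0.1, 0.0], [0.2])],
    "post-anthesis": [fuse([0.1, 0.9, 0.2], [0.8])],
}
query = fuse([0.85, 0.2, 0.05], [0.25])
print(few_shot_predict(query, support))  # → "pre-anthesis"
```

In this framing, adapting to a new field site only requires collecting one to five labeled support plants per class (the "anchors"), rather than retraining the backbone—consistent with the anchor-transfer results reported above.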

This multimodal AI system provides breeders with a reliable decision-support tool to plan hybridization and manage pollination windows more efficiently. For genetically modified (GM) crop trials, it can ensure compliance with regulatory frameworks by forecasting flowering in advance, thereby reducing costs and manual inspection frequency. By merging visual phenotyping with weather analysis, this method bridges the gap between static imaging and dynamic environmental modeling, marking a significant step toward intelligent, automated phenology prediction in precision agriculture.

###

References

DOI

10.1016/j.plaphe.2025.100091

Original URL

https://doi.org/10.1016/j.plaphe.2025.100091

Funding information

This work was supported by the ARC Training Centre for Accelerated Future Crops Development (IC210100047), the South Australian Research and Development Institute, and University of Adelaide Research Scholarships.

About Plant Phenomics

Plant Phenomics is dedicated to publishing novel research that will advance all aspects of plant phenotyping, from the cell to the plant population level, using innovative combinations of sensor systems and data analytics. Plant Phenomics also aims to connect phenomics to other science domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer sciences. Plant Phenomics should thus contribute to advancing plant sciences and agriculture, forestry, and horticulture by addressing key scientific challenges in the area of plant phenomics.


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.