Next-generation phenotyping robot brings AI-driven insight to crop growth and stress response


23/10/2025 TranSpread

Equipped with RGB, hyperspectral, and depth sensors, the newly developed PhenoRob-F robot can autonomously navigate crop fields, capturing and analyzing data with exceptional accuracy. PhenoRob-F achieved impressive results in detecting wheat ears, segmenting rice panicles, reconstructing 3D plant structures, and classifying drought severity in rice with over 99% accuracy.

To meet the global challenge of increasing food production under climate change, plant breeders require reliable phenotypic data linking genes to observable traits such as growth, yield, and stress tolerance. Traditional manual measurements are labor-intensive and prone to error, while controlled-environment phenotyping systems fail to capture field variability. Aerial systems such as drones offer speed but lack payload and resolution, and fixed gantry systems are expensive and immobile. Autonomous mobile robots bridge these gaps with their flexible mobility, high-resolution imaging, and minimal soil disturbance. However, existing robots have struggled to balance precision, stability, and scalability under field conditions. Motivated by these challenges, researchers designed PhenoRob-F to deliver robust, high-throughput phenotyping across multiple crops and environments.

A study (DOI: 10.1016/j.plaphe.2025.100085), published in Plant Phenomics on 13 August 2025 by Peng Song’s team at Huazhong Agricultural University, presents PhenoRob-F as a powerful tool for plant breeders and agricultural researchers, enabling high-throughput, precise, and automated data acquisition that accelerates genetic discovery and crop improvement under real-world field conditions.

To evaluate the performance of PhenoRob-F under real-world conditions, the research team conducted three field experiments using multiple sensing and modeling techniques. The first experiment focused on RGB image acquisition for wheat and rice during the heading stage, where top-view canopy images were captured and analyzed using the YOLOv8m and SegFormer_B0 deep learning models. These enabled accurate detection of wheat ears and segmentation of rice panicles for yield estimation. The robot achieved a precision of 0.783, a recall of 0.822, and a mean average precision (mAP) of 0.853 for wheat, while rice panicle segmentation reached a mean intersection over union (mIoU) of 0.949 and an accuracy of 0.987, demonstrating robust visual performance.

The second experiment employed an RGB-D depth camera to reconstruct the 3D structures of maize and rapeseed plants across growth stages. Using the scale-invariant feature transform (SIFT) and iterative closest point (ICP) algorithms, the robot generated high-fidelity point clouds for estimating plant height, achieving strong correlations with manual measurements (R² = 0.99 for maize and 0.97 for rapeseed).

The third experiment applied hyperspectral imaging to rice under drought stress, collecting spectral data in the 900–1700 nm range to classify drought severity. After feature extraction and reduction via the CARS algorithm, a random forest model achieved classification accuracies ranging from 97.7% to 99.6% across five drought levels.

Operationally, PhenoRob-F demonstrated high efficiency, completing phenotyping rounds in 2–2.5 hours and processing up to 1,875 potted plants per hour. These experiments collectively confirmed the robot’s capability to autonomously collect multimodal data, integrate spectral and 3D imaging, and deliver high-precision phenotypic trait analysis across diverse crop species.
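For readers unfamiliar with the scores cited above, the following sketch shows how detection and segmentation metrics of this kind are conventionally computed. The helper functions and the toy counts are illustrative only; they are not taken from the paper, which reports metrics over full field datasets.

```python
def precision_recall(tp, fp, fn):
    """Precision = correct detections / all detections;
    recall = correct detections / all ground-truth objects."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def iou(pred, target):
    """Intersection over union between predicted and ground-truth pixel sets,
    the per-class quantity averaged to obtain an mIoU score."""
    inter = len(pred & target)
    union = len(pred | target)
    return inter / union if union else 1.0

# Toy wheat-ear example: 80 ears detected, of which 63 are correct (tp),
# 17 are spurious (fp), and 14 real ears were missed (fn).
p, r = precision_recall(tp=63, fp=17, fn=14)
print(p, r)

# Toy segmentation example: two overlapping pixel sets.
print(iou({1, 2, 3}, {2, 3, 4}))
```

Mean average precision (mAP) extends this idea by averaging precision over recall levels and detection-confidence thresholds, which is why it is reported alongside single-threshold precision and recall.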

PhenoRob-F offers a practical, cost-effective solution for field-based phenotyping, providing researchers and breeders with an automated means to evaluate crop performance across diverse conditions. The system can assist in yield prediction, stress monitoring, and genetic screening, ultimately supporting the development of climate-resilient and high-yield crop varieties. Beyond breeding, its hyperspectral and 3D imaging capabilities could be extended to monitor soil health, nutrient management, and pest detection. By significantly reducing the labor and time required for data collection, PhenoRob-F accelerates the transition from genomic data to field application—bridging a critical gap in modern agriculture’s digital transformation.

###

References

DOI

10.1016/j.plaphe.2025.100085

Original URL

https://doi.org/10.1016/j.plaphe.2025.100085

Funding information

This work was supported by the National Key Research and Development Program of China (2021YFD1200504, 2022YFD2002304), the National Natural Science Foundation of China (32471992), the Key Core Technology Project in Agriculture of Hubei Province (HBNYHXGG2023-9), and the Supporting Project for High-Quality Development of the Seed Industry of Hubei Province (HBZY2023B001-06).

About Plant Phenomics

Plant Phenomics is dedicated to publishing novel research that will advance all aspects of plant phenotyping, from the cell to the plant population level, using innovative combinations of sensor systems and data analytics. Plant Phenomics also aims to connect phenomics to other science domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer science. Plant Phenomics should thus contribute to advancing plant sciences and agriculture/forestry/horticulture by addressing key scientific challenges in the area of plant phenomics.

Title of original paper: PhenoRob-F: An autonomous ground-based robot for high-throughput phenotyping of field crops
Authors: Meng Yang, Zhengda Li, Jiale Cui, Yang Shao, Ruifang Zhai, Wen Qiao, Wanneng Yang, Peng Song
Journal: Plant Phenomics
Original Source URL: https://doi.org/10.1016/j.plaphe.2025.100085
DOI: 10.1016/j.plaphe.2025.100085
Latest article publication date: 13 August 2025
Subject of research: Not applicable
COI statement: The authors declare that they have no competing interests.
Attachments
  • Figure 3 Phenotyping module system architecture and data flow.
Regions: North America, United States, Asia, China
Keywords: Applied science, Engineering


