Researchers at Huazhong Agricultural University have developed an autonomous field phenotyping robot, PhenoRob-F, designed to collect and analyze plant data using artificial intelligence and multimodal imaging technologies. The system, described in a peer-reviewed article published on 13 August 2025 in Plant Phenomics (DOI: 10.1016/j.plaphe.2025.100085), was created to address the limitations of manual, drone-based, and fixed phenotyping systems in crop research.
PhenoRob-F integrates RGB, hyperspectral, and depth sensors to navigate fields and gather data on plant growth and stress responses. In field trials, the robot was tested on wheat, rice, maize, and rapeseed to evaluate its performance in yield estimation, 3D reconstruction, and drought stress classification.
In wheat and rice experiments, the robot captured top-view canopy images and applied the deep learning models YOLOv8m and SegFormer_B0 for ear detection and panicle segmentation, respectively. The system achieved a mean average precision (mAP) of 0.853 for wheat ear detection and a mean intersection over union (mIoU) of 0.949, with 98.7% accuracy, for rice panicle segmentation. In maize and rapeseed trials, the robot used RGB-D imaging together with the scale-invariant feature transform (SIFT) and iterative closest point (ICP) algorithms to reconstruct 3D plant structures; the reconstructed heights correlated strongly with manual measurements (R² = 0.99 for maize and 0.97 for rapeseed).
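The iterative closest point step used in the 3D reconstruction can be illustrated with a minimal NumPy sketch: repeatedly match each point to its nearest neighbour in the target cloud, then re-estimate the rigid transform via SVD (the Kabsch method). This is a toy implementation on synthetic data, not the robot's actual pipeline.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Rigid transform (R, t) mapping src onto dst, via SVD (Kabsch)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, iters=50):
    """Align src to dst: match nearest neighbours, re-fit, repeat."""
    cur = src.copy()
    for _ in range(iters):
        # Brute-force nearest neighbour in dst for every point in cur
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        R, t = best_fit_transform(cur, dst[d.argmin(axis=1)])
        cur = cur @ R.T + t
    return cur

# Demo: recover a small rigid motion applied to a synthetic cloud
rng = np.random.default_rng(0)
dst = rng.random((40, 3))                         # "target" cloud
theta = 0.03
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
src = dst @ Rz.T + np.array([0.02, -0.01, 0.01])  # rotated + shifted copy
aligned = icp(src, dst)
```

Production systems typically accelerate the nearest-neighbour search with a k-d tree and add outlier rejection, but the estimate-match loop above is the core of the algorithm.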
For drought stress classification in rice, PhenoRob-F employed hyperspectral imaging in the 900–1700 nm range. After selecting informative wavelengths with the competitive adaptive reweighted sampling (CARS) algorithm and classifying with a random forest model, the system achieved accuracies between 97.7% and 99.6% across five drought levels. The robot completed phenotyping cycles within 2–2.5 hours, processing up to 1,875 potted plants per hour.
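The classification stage of this workflow, spectral features fed to a random forest, can be sketched with scikit-learn on synthetic spectra. The band indices, shift magnitudes, and sample counts below are illustrative stand-ins, and the CARS wavelength-selection step is omitted:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic "spectra": 5 drought levels, each shifting reflectance
# in a few bands (a crude stand-in for a water-absorption signature)
n_per_class, n_bands = 60, 50
X_parts, y_parts = [], []
for level in range(5):
    spectra = rng.normal(0.5, 0.05, size=(n_per_class, n_bands))
    spectra[:, 10:15] += 0.1 * level      # level-dependent shift
    X_parts.append(spectra)
    y_parts.append(np.full(n_per_class, level))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

On real hyperspectral data, running CARS first prunes redundant bands, which both speeds up the forest and reduces overfitting to noisy wavelengths.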
According to the study, PhenoRob-F enables large-scale, automated, and high-precision phenotyping in field environments, supporting applications in yield prediction, stress assessment, and genetic analysis. The research team suggests that the platform may also be adapted for soil monitoring, nutrient management, and pest detection, contributing to data-driven approaches in agricultural breeding and crop management.
