Drones + “Segment Anything” automate peanut field phenotyping for faster breeding

27.12.2025 TranSpread

By combining unmanned aerial system (UAS) imagery with the Segment Anything Model (SAM) for plot segmentation and convolutional neural networks (CNNs) for trait classification, a new phenotyping pipeline delivers reliable, plot-level estimates of canopy height (CH), growth habit (GH), and mainstem prominence (MP) in peanut breeding trials.
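To make the segmentation step concrete, here is a minimal sketch of how SAM's promptable mode can be driven with automatically generated point prompts, in the spirit of the pipeline's plot delineation. It assumes Meta's open-source segment-anything package; the checkpoint file, image path, and prompt coordinates are illustrative placeholders, not values from the study.

```python
# Minimal sketch of point-prompted plot segmentation with Meta's
# segment-anything package. The checkpoint, image path, and prompt
# coordinates below are illustrative placeholders.
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

# Load an orthomosaic tile as RGB (OpenCV reads BGR by default).
tile = cv2.cvtColor(cv2.imread("orthomosaic_tile.png"), cv2.COLOR_BGR2RGB)
predictor.set_image(tile)

# A few foreground points scattered around one plot centroid (x, y pixels),
# standing in for the pipeline's centroid-derived multi-point prompts.
points = np.array([[412, 305], [400, 290], [425, 318]])
labels = np.ones(len(points), dtype=int)  # 1 = foreground

masks, scores, _ = predictor.predict(
    point_coords=points,
    point_labels=labels,
    multimask_output=False,
)
plot_mask = masks[0]  # boolean HxW mask for this plot
```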

Peanut is a major crop for food and trade, requiring continuous breeding gains in yield, quality, and stress resistance. Yet progress is often constrained by field phenotyping, which is labor-intensive and difficult to scale. Architectural traits such as canopy height, growth habit, and mainstem prominence are especially important for canopy structure, adaptation, and mechanical harvestability. While drones accelerate data collection, image processing and plot segmentation often remain semi-manual, particularly in heterogeneous fields with overlapping canopies or misaligned rows. Foundation vision models like SAM offer more generalizable segmentation with minimal training data, but agricultural scenes still pose challenges from clutter and shadows.

A study (DOI: 10.1016/j.plaphe.2025.100126), published in Plant Phenomics on 27 October 2025 by Peggy Ozias-Akins's team at the University of Georgia, demonstrates that a fully automated, SAM- and deep learning-based UAS phenotyping pipeline can reliably replace labor-intensive field measurements while preserving the genetic signals needed for large-scale peanut breeding and QTL analysis.

This study presents a fully automated high-throughput phenotyping (HTP) pipeline that integrates UAS-derived orthomosaics, SAM, digital surface modeling, and CNNs to extract plot-level architectural traits in peanut breeding trials. The workflow begins with SAM auto-mask generation to identify experimental field boundaries and estimate field orientation without manual input, followed by metric-based post-processing using stability scores and predicted IoU to select optimal masks. Individual plots are then delineated using SAM's interactive mode with automatically generated multi-point prompts derived from temporary plot centroids, eliminating the need for hand-drawn plot polygons and enabling scalable plot extraction in heterogeneous fields.

For canopy height (CH) estimation, local terrain points adjacent to each plot are sampled to reconstruct a plot-specific digital terrain model and normalized digital surface model (nDSM), allowing plant height to be isolated from terrain variability caused by furrows, wheel tracks, or raised beds. Plot-level CH estimates showed strong agreement with manual measurements (R² ≈ 0.78, RMSE < 3 cm, MAPE ≈ 10%), confirming effective terrain normalization.

Growth habit (GH) and mainstem prominence (MP) were classified using pretrained CNNs (AlexNet, ResNet18, and EfficientNet-B0) trained on RGB imagery, nDSM data, or their combination. For GH, AlexNet achieved the highest accuracy (88.5%) when combining RGB and nDSM inputs, with balanced recall for Bunch and Spreading types, highlighting the value of integrating structural and visual information. In contrast, MP estimation relied primarily on structural cues, with nDSM-only models reaching accuracies of about 83%, while adding RGB provided little benefit.

Automated field identification achieved IoU values mostly above 0.95 and orientation errors below 0.8°, and plot segmentation reached a specificity of 0.99, a sensitivity of 0.87, and a Dice coefficient of 0.92, at roughly 2 seconds per plot. Finally, QTL analyses using HTP-derived GH and MP identified the same major loci on chromosome Arahy.15 as conventional phenotyping, demonstrating that the automated pipeline delivers accurate, efficient, and genetically informative phenotypes suitable for large-scale breeding applications.
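To illustrate the canopy-height step described above, the sketch below derives one plot-level CH value from a digital surface model using terrain points sampled beside the plot. The least-squares plane fit for the plot-specific terrain model and the 95th-percentile height statistic are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of plot-level canopy height from a DSM and locally sampled
# terrain points. The plane fit and the 95th-percentile statistic are
# illustrative choices, not the paper's exact method.
import numpy as np

def plot_canopy_height(dsm, plot_mask, terrain_pts):
    """dsm: HxW elevation raster; plot_mask: boolean mask for one plot;
    terrain_pts: (N, 3) array of (row, col, elevation) bare-ground samples
    taken from alleys adjacent to the plot."""
    # Fit a local plane z = a*row + b*col + d to the sampled terrain points.
    A = np.c_[terrain_pts[:, 0], terrain_pts[:, 1], np.ones(len(terrain_pts))]
    coeff, *_ = np.linalg.lstsq(A, terrain_pts[:, 2], rcond=None)

    rows, cols = np.nonzero(plot_mask)
    dtm = coeff[0] * rows + coeff[1] * cols + coeff[2]  # plot-specific DTM
    ndsm = dsm[rows, cols] - dtm                        # height above ground
    return np.percentile(ndsm, 95)  # robust plot-level canopy height
```

Fitting the plane only to points sampled beside the plot keeps the terrain model local, so furrows or raised beds elsewhere in the field do not bias the height estimate.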

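For the trait-classification step, this sketch shows one way to feed fused RGB + nDSM input to a pretrained AlexNet, echoing the finding that GH benefits from combining visual and structural channels. The extra-channel initialization and the two-class head are illustrative assumptions, not the authors' training configuration.

```python
# Sketch of adapting a pretrained AlexNet to 4-channel RGB + nDSM input
# for growth-habit classification. The weight-copy scheme and two-class
# head are assumptions for illustration.
import torch
import torch.nn as nn
from torchvision.models import alexnet, AlexNet_Weights

model = alexnet(weights=AlexNet_Weights.DEFAULT)

# Replace the first conv layer to accept RGB + nDSM; reuse the pretrained
# RGB kernels and initialize the extra nDSM channel with their mean.
old = model.features[0]
new = nn.Conv2d(4, old.out_channels, kernel_size=old.kernel_size,
                stride=old.stride, padding=old.padding)
with torch.no_grad():
    new.weight[:, :3] = old.weight
    new.weight[:, 3:] = old.weight.mean(dim=1, keepdim=True)
    new.bias.copy_(old.bias)
model.features[0] = new

# Two output classes: Bunch vs. Spreading growth habit.
model.classifier[6] = nn.Linear(model.classifier[6].in_features, 2)

x = torch.randn(1, 4, 224, 224)  # stacked RGB + nDSM plot image
logits = model(x)
```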
A pipeline like this can help breeding teams phenotype large trials more frequently and consistently, enabling faster selection cycles and stronger datasets for QTL mapping, GWAS, and genomic selection. Automating plot segmentation and trait extraction can also improve reproducibility across years and locations, since the same rules are applied uniformly instead of relying on manual delineation or scorer-dependent ratings.

###

References

DOI: 10.1016/j.plaphe.2025.100126
Original Source URL: https://doi.org/10.1016/j.plaphe.2025.100126

Funding information

This work was supported by the United States Department of Agriculture National Institute of Food and Agriculture [Award Number 2022-67013-37365].

About Plant Phenomics

Plant Phenomics is dedicated to publishing novel research that advances all aspects of plant phenotyping, from the cell to the plant population level, using innovative combinations of sensor systems and data analytics. The journal also aims to connect phenomics to other scientific domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer science. Plant Phenomics thus contributes to advancing plant science and agriculture, forestry, and horticulture by addressing key scientific challenges in plant phenomics.

Title of original paper: Aerial imagery and Segment Anything Model for architectural trait phenotyping to support genetic analysis in peanut breeding
Authors: Javier Rodriguez-Sanchez, Raissa Martins Da Silva, Ye Chu, Lenin Rodriguez, Jing Zhang, Kyle Johnsen, Peggy Ozias-Akins, Changying Li
Journal: Plant Phenomics
Original Source URL: https://doi.org/10.1016/j.plaphe.2025.100126
DOI: 10.1016/j.plaphe.2025.100126
Latest article publication date: 27 October 2025
Subject of research: Not applicable
COI statement: The authors declare that they have no competing interests.
Attached documents
  • Figure 4. SAM-based plot segmentation and terrain sampling. (A) Crop ranges detection based on vertical summation of vegetation pixels. Red triangles mark the midpoints of regions with minimal vegetation, corresponding to inter-row alleys. (B) Zoomed-in view of the area highlighted by the red rectangle in panel (A). Red triangles indicate peaks in accumulated vegetation, corresponding to crop rows. (C) Distribution of sampling points used for local terrain estimation. SAM-generated masks for individual rows are shown in pink and turquoise. Blue dash-dotted lines represent the detected crop rows. Green circles denote plot centroids used as prompts for SAM, and red circles indicate terrain sampling points.
Regions: North America, United States
Keywords: Applied science, Engineering

