Dual-UAV system boosts color accuracy in crop remote sensing

16/12/2025 TranSpread

Tests showed that the dual-UAV method improved color fidelity by over 70%, aligning image color with ground-truth measurements and raising rice maturity prediction accuracy from R² = 0.28 to 0.67. The technology offers a scalable pathway for reliable high-throughput phenotyping and breeding applications in precision agriculture.

Plant color reflects physiological status, nutrient levels, and biotic or abiotic stress, making it a key trait in breeding and field management. Traditional visual inspection is subjective and inefficient, while satellite imagery lacks resolution for fine phenotyping. UAV-based remote sensing fills this gap, but outdoor imaging suffers from inconsistent illumination, camera variability, and flight-to-flight differences. Past solutions—image feature transfer, reference image selection, deep learning enhancement—improved color consistency to some extent, but accuracy was limited and performance depended strongly on reference quality. ColorChecker chart correction is widely used in photography and printing, yet covering each UAV view is impractical in field missions. These limitations highlighted the need for a real-time, mobile, and scalable color standardization method for agricultural remote sensing.

A study (DOI: 10.1016/j.plaphe.2025.100101) published in Plant Phenomics on 5 September 2025 by Haiyan Cen’s team, Zhejiang University, demonstrates a reliable dual-UAV color correction method that significantly enhances the accuracy and consistency of field-scale RGB imagery, enabling more precise crop phenotyping and agricultural decision-making.

An evaluation was performed to verify the performance of the CoF-CC color correction method, beginning with tests across six different camera models in which images containing both the ColorChecker and rice fields were captured and corrected using a color correction matrix (CCM). The method successfully reduced brightness and color discrepancies among the original images, lowering the ΔE values of ColorChecker patches from 5.8–18.0 to 3.4–5.0, a 66.1% improvement in color accuracy. Color distributions plotted in LAB 3D space formed tighter clusters after correction, with the intracluster distance decreasing from 13.2 to 3.9, indicating a 70.2% enhancement in color consistency.

Further accuracy assessment using large-scale field images confirmed these outcomes: ΔE values across variable image sets dropped from 8.0–25.9 to 2.3–8.2, representing a 73.6% improvement. When color-cast images were corrected, artificial distortions were reduced to an average ΔE of 7.1, and comparisons with ground-truth measurements at 30 sampling points showed the mean ΔE falling from 18.2 to 5.0, a 72.7% improvement. The corrected images also enabled clearer visualization of ripening differences.

To evaluate segmentation effectiveness, ground-truth leaf colors were mapped onto both the original and corrected images. Segmentation of the original images matched actual leaf shapes poorly, whereas the corrected images aligned well with manual labeling, increasing overlap from 15.1% to 64.3% and demonstrating significantly improved target identification. Finally, the corrected orthomosaic allowed extraction of canopy color values for each plot, effectively differentiating breeding materials. While colors from the original images showed only weak correlation with rice maturity (R² = 0.28), CoF-CC-derived colors achieved a strong linear relationship (R² = 0.67), confirming the method's value for phenotyping and maturity prediction.
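The correction step described above, fitting a color correction matrix from ColorChecker patches and scoring accuracy as ΔE, can be sketched as follows. The patch values are invented for illustration and the least-squares 3×3 CCM fit is an assumption; the paper's actual data and pipeline are not reproduced here:

```python
import numpy as np

# Hypothetical RGB values (0-1) for six ColorChecker patches as measured
# in a UAV image versus the chart's reference values; a real chart has 24.
measured = np.array([[0.42, 0.30, 0.25],
                     [0.75, 0.58, 0.42],
                     [0.20, 0.32, 0.55],
                     [0.35, 0.55, 0.30],
                     [0.90, 0.85, 0.80],
                     [0.10, 0.10, 0.12]])
reference = np.array([[0.45, 0.32, 0.27],
                      [0.78, 0.60, 0.45],
                      [0.22, 0.34, 0.58],
                      [0.38, 0.57, 0.33],
                      [0.95, 0.92, 0.88],
                      [0.12, 0.12, 0.14]])

def fit_ccm(measured, reference):
    """Least-squares 3x3 color correction matrix M with reference ~= measured @ M."""
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return M

def delta_e_cie76(lab_a, lab_b):
    """CIE76 color difference: Euclidean distance between CIELAB triples."""
    return np.linalg.norm(np.asarray(lab_a) - np.asarray(lab_b), axis=-1)

M = fit_ccm(measured, reference)
corrected = np.clip(measured @ M, 0.0, 1.0)

# The fitted CCM should move every patch closer to its reference value.
err_before = np.linalg.norm(measured - reference, axis=1).mean()
err_after = np.linalg.norm(corrected - reference, axis=1).mean()
print(f"mean patch error: {err_before:.3f} -> {err_after:.3f}")
```

Note that the study reports ΔE in CIELAB space; the RGB-to-LAB conversion is omitted here for brevity, and `delta_e_cie76` assumes its inputs are already LAB triples.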

Accurate field-scale color measurement plays a vital role in agricultural decision-making, supporting genotype evaluation in breeding nurseries, crop stress and nutrient assessment, harvest timing estimation, digital phenotyping, and large-scale monitoring for precision agriculture. By leveraging the relationship between canopy color, chlorophyll content, and maturity, the CoF-CC pipeline reliably extracts color-based signals that are often masked by illumination variation. It thus marks an important step toward standardized RGB datasets, improved data sharing across experiments, and scalable high-throughput phenotyping and automation in modern agriculture.

###

References

DOI

10.1016/j.plaphe.2025.100101

Original Source URL

https://doi.org/10.1016/j.plaphe.2025.100101

Funding information

This work was funded by the International S&T Cooperation Program of China (2024YFE0115000), the National Key R&D Program of China (2021YFD2000104), the National Natural Science Foundation of China (32371985), and the Fundamental Research Funds for the Central Universities (226-2022-00217).

About Plant Phenomics

Plant Phenomics is dedicated to publishing novel research that will advance all aspects of plant phenotyping, from the cell to the plant population level, using innovative combinations of sensor systems and data analytics. Plant Phenomics also aims to connect phenomics to other science domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer sciences. Plant Phenomics should thus contribute to advancing plant sciences and agriculture/forestry/horticulture by addressing key scientific challenges in the area of plant phenomics.

Title of original paper: Precise Image Color Correction Based on Dual Unmanned Aerial Vehicle Cooperative Flight
Authors: Xuqi Lu, Jiayang Xie, Jiayou Yan, Ji Zhou, Haiyan Cen
Journal: Plant Phenomics
Original Source URL: https://doi.org/10.1016/j.plaphe.2025.100101
DOI: 10.1016/j.plaphe.2025.100101
Latest article publication date: 5 September 2025
Subject of research: Not applicable
COI statement: The authors declare that they have no competing interests.
Attached files
  • Figure 2. Workflow for UAV image color correction and mosaicking. (a) Detection of ColorChecker boundary points in UAV imagery. (b) Calculation of the color correction matrix (CCM) for image color correction. (c) Mask generation based on the outward expanded ColorChecker boundary points. (d) Orthomosaic generation without masking. (e) Orthomosaic generation with masking. The white arrows in (d) indicate the interference of ColorChecker in the orthomosaic.
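Steps (c) through (e) of the captioned workflow, expanding the detected ColorChecker boundary and masking the chart so it does not contaminate the orthomosaic, might look roughly like this minimal sketch. The centroid-based expansion rule and the bounding-box mask are simplifying assumptions, not the authors' implementation:

```python
import numpy as np

def expand_boundary(corners, scale=1.2):
    """Push boundary points outward from their centroid by `scale`
    (assumed expansion rule; the paper's exact method may differ)."""
    corners = np.asarray(corners, dtype=float)
    centroid = corners.mean(axis=0)
    return centroid + scale * (corners - centroid)

def chart_mask(image_shape, corners):
    """Boolean mask over the axis-aligned bounding box of the expanded
    boundary (a polygon fill would be tighter; a box keeps this short)."""
    corners = np.asarray(corners)
    rows, cols = image_shape[:2]
    r0 = max(int(np.floor(corners[:, 1].min())), 0)
    r1 = min(int(np.ceil(corners[:, 1].max())), rows)
    c0 = max(int(np.floor(corners[:, 0].min())), 0)
    c1 = min(int(np.ceil(corners[:, 0].max())), cols)
    mask = np.zeros((rows, cols), dtype=bool)
    mask[r0:r1, c0:c1] = True
    return mask

# Example: a 100x100 image with the chart detected as a 20x20 quad;
# masked pixels would be excluded when the orthomosaic is stitched.
quad = [(40, 40), (60, 40), (60, 60), (40, 60)]
mask = chart_mask((100, 100), expand_boundary(quad, scale=1.2))
```

Expanding the boundary before masking gives a safety margin so that no chart pixels leak into the stitched orthomosaic, which is the artifact the white arrows in panel (d) point out.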
Regions: North America, United States, Asia, China
Keywords: Applied science, Engineering

