AI-powered image generation sharpens accuracy in crop disease severity assessment

20.09.2025 TranSpread

The method, called location-guided lesion representation learning (LLRL), overcomes the common challenge of background interference that often causes existing models to confuse healthy areas with lesions. By training the system to focus directly on diseased regions, the team achieved higher reliability in classifying disease severity across apple, potato, and tomato leaves.

Global food production must increase by 50% by 2050 to feed the growing population. Yet plant diseases already cut annual yields by 13%–22%, representing billions of dollars in agricultural losses worldwide. Traditional methods of assessing disease severity rely on human expertise or laboratory testing, both of which are costly, time-intensive, and subjective. Advances in machine learning and deep learning have enabled automated recognition of plant diseases, often with over 90% accuracy. However, most models still struggle to distinguish lesions from background features such as shadows, soil, or healthy tissue, and these errors limit their usefulness for guiding pesticide use. Motivated by these challenges, the researchers developed a method that directly targets lesion areas, improving assessment reliability.

A study (DOI: 10.1016/j.plaphe.2025.100058) published in Plant Phenomics on 26 May 2025 by Qi Wang’s team at Guizhou University enhances the accuracy and interpretability of plant leaf disease severity assessment, enabling more precise pesticide application and advancing sustainable agricultural management.

The location-guided lesion representation learning (LLRL) framework was designed to enhance the accuracy of plant leaf disease severity assessment by combining advanced network architectures with robust experimental validation. The system integrates three components: an image generation network (IG-Net), which employs a diffusion model to generate paired healthy–diseased images; a location-guided lesion representation learning network (LGR-Net), which leverages these pairs to isolate lesion areas and produce a dual-branch feature encoder (DBF-Enc) enriched with lesion-specific knowledge; and a hierarchical lesion fusion assessment network (HLFA-Net), which fuses these features to deliver precise severity classification.

To validate the method, the researchers built a dataset of 12,098 images covering apple, potato, and tomato leaf diseases, supplemented by more than 10,000 generated pairs, and implemented the experiments in Python 3.8.19 and PyTorch 1.13.1 with GPU acceleration. LGR-Net was trained with the Adam optimizer, a weight decay of 1 × 10⁻⁴, and a scheduled learning-rate decay across 4,000 iterations, while HLFA-Net was trained for 100 epochs at a fixed learning rate of 0.01, with the shared DBF-Enc module frozen.

When compared against 12 benchmark models across real, generated, and mixed datasets, LLRL consistently outperformed the alternatives, achieving at least 1% higher accuracy and reaching up to 92.4% with the combined use of pre-training and attention mechanisms. Visualization experiments further confirmed its ability to precisely localize lesion regions (IoU = 0.934, F1 = 0.9615), while feature maps showed progressive concentration on lesions at lower resolutions. Grad-CAM analysis revealed attention patterns that shifted toward lesions with increasing severity, aligning with established pathology knowledge. The framework demonstrated strong generalization across crop species, with particularly robust results for the tomato and potato datasets, highlighting its potential as a versatile and reliable tool for agricultural disease management.
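As an illustrative sketch of that two-stage training schedule (not the authors’ released code), the snippet below builds a generic dual-branch encoder standing in for DBF-Enc, sets up Adam with a weight decay of 1e-4 and a stepped learning-rate decay for stage one, then freezes the encoder and trains a small severity-classification head standing in for HLFA-Net at a fixed learning rate of 0.01. The module names, layer sizes, number of severity classes, decay step size, and choice of SGD for the second stage are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the paper's dual-branch feature encoder (DBF-Enc);
# the real network is far richer (diffusion-generated healthy-diseased pairs,
# lesion-location guidance, etc.).
class DualBranchEncoder(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.branch_a = nn.Sequential(nn.Conv2d(3, dim, 3, stride=2, padding=1), nn.ReLU())
        self.branch_b = nn.Sequential(nn.Conv2d(3, dim, 3, stride=2, padding=1), nn.ReLU())

    def forward(self, x):
        # Fuse the two branch feature maps along the channel dimension.
        return torch.cat([self.branch_a(x), self.branch_b(x)], dim=1)

# Hypothetical severity head standing in for HLFA-Net
# (4 severity levels is a placeholder, not from the paper).
class SeverityHead(nn.Module):
    def __init__(self, in_dim=128, num_classes=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(in_dim, num_classes)

    def forward(self, feats):
        return self.fc(self.pool(feats).flatten(1))

encoder, head = DualBranchEncoder(), SeverityHead()

# Stage 1: train the encoder with Adam, weight decay 1e-4, and a scheduled
# learning-rate decay over 4,000 iterations (the step size here is a guess).
opt1 = torch.optim.Adam(encoder.parameters(), lr=1e-3, weight_decay=1e-4)
sched1 = torch.optim.lr_scheduler.StepLR(opt1, step_size=1000, gamma=0.5)

# Stage 2: freeze the shared encoder, then train only the head for 100 epochs
# at a fixed learning rate of 0.01 (SGD is an assumption; this summary does
# not name the stage-2 optimizer).
for p in encoder.parameters():
    p.requires_grad = False
opt2 = torch.optim.SGD(head.parameters(), lr=0.01)

x = torch.randn(2, 3, 224, 224)   # dummy batch of leaf images
logits = head(encoder(x))         # severity logits, shape (2, 4)
```

Freezing the shared encoder in the second stage mirrors the idea of reusing the lesion-aware features without disturbing them while the severity classifier is trained.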

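The localization scores quoted above (IoU = 0.934, F1 = 0.9615) are standard overlap metrics between a predicted lesion mask and its ground-truth annotation. As a reference point, and not the paper’s evaluation code, a minimal NumPy computation for binary masks might look like this:

```python
import numpy as np

def iou_f1(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-8):
    """IoU and F1 (Dice) between two binary lesion masks of equal shape."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()   # lesion pixels correctly found
    fp = np.logical_and(pred, ~truth).sum()  # background flagged as lesion
    fn = np.logical_and(~pred, truth).sum()  # lesion pixels missed
    iou = tp / (tp + fp + fn + eps)          # intersection over union
    f1 = 2 * tp / (2 * tp + fp + fn + eps)   # Dice coefficient / F1 score
    return float(iou), float(f1)

# Example: a perfect prediction gives IoU = F1 = 1.0.
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 20:40] = True
print(iou_f1(mask, mask))
```

An IoU of 0.934 thus means the predicted and annotated lesion regions overlap on 93.4% of their combined area.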
By enabling accurate grading of disease severity, LLRL provides a powerful foundation for precision pesticide application. Farmers could use smartphone photos of leaves to instantly assess disease progression and receive guidance on dosage and timing. At larger scales, drones and satellite imaging could integrate the system for automated monitoring across entire fields, significantly reducing manual inspection demands. This not only saves costs but also minimizes unnecessary pesticide use, reducing environmental pollution and safeguarding farmer income.

###

References

DOI: 10.1016/j.plaphe.2025.100058
Original URL: https://doi.org/10.1016/j.plaphe.2025.100058

Funding information

This research was supported by the National Key R&D Program of China (2024YFE0214300), the Guizhou Provincial Science and Technology Projects ([2024]002, CXTD[2023]027), the Guizhou Province Youth Science and Technology Talent Project ([2024]317), the Guiyang Guian Science and Technology Talent Training Project ([2024] 2-15), and the Talent Introduction Program of Guizhou University under Grant No. (2021)89.

About Plant Phenomics

Plant Phenomics is dedicated to publishing novel research that advances all aspects of plant phenotyping, from the cell to the plant population level, using innovative combinations of sensor systems and data analytics. The journal also aims to connect phenomics to other science domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer science. Plant Phenomics thus contributes to advancing plant sciences, agriculture, forestry, and horticulture by addressing key scientific challenges in the area of plant phenomics.

Title of original paper: Location-guided lesions representation learning via image generation for assessing plant leaf diseases severity
Authors: Ya Yu, Xingcai Wu, Peijia Yu, Qiaoling Wan, Yujiao Dan, Yuanyuan Xiao, Qi Wang
Journal: Plant Phenomics
Original Source URL: https://doi.org/10.1016/j.plaphe.2025.100058
DOI: 10.1016/j.plaphe.2025.100058
Latest article publication date: 26 May 2025
Subject of research: Not applicable
COI statement: The authors declare that they have no competing interests.
Attached documents
  • Figure 3. Framework of our approach. The approach consists of an image generation network (IG-Net), location-guided lesion representation learning network (LGR-Net), and hierarchical lesion fusion evaluation network (HLFA-Net).
Regions: North America, United States
Keywords: Applied science, Engineering
