Seeing beyond the visible: Researchers turn everyday cameras into crop analysis tools

URBANA, Ill., USA – Agricultural producers and manufacturers often need information about crop attributes, from nutrient content to chemical composition, to make management decisions. In recent years, multispectral imaging has emerged as a useful tool for product analysis, but the required equipment is expensive. Standard RGB cameras are much more affordable, but their images show only visible attributes.

However, if RGB images can be “translated” to multispectral images, pictures taken with a smartphone or any regular camera can yield sophisticated information. This process requires complex computer modeling and machine learning, but once the techniques are developed, they can be applied to simple devices anyone can use.
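To illustrate the idea of "translating" RGB to multispectral data, here is a minimal sketch using per-pixel ridge regression. This is not the researchers' method (their models are deep networks), and all of the data here is synthetic; it only shows the core pattern of fitting a mapping on paired RGB/multispectral training data and then applying it to new RGB images.

```python
import numpy as np

# Illustrative sketch, not the published approach: learn a per-pixel
# mapping from 3 RGB values to a hypothetical 10-band spectrum.
rng = np.random.default_rng(0)

n_pixels, n_bands = 5000, 10
rgb = rng.random((n_pixels, 3))          # training RGB values in [0, 1]
W_true = rng.random((3, n_bands))        # unknown RGB-to-spectrum map
spectra = rgb @ W_true + 0.01 * rng.standard_normal((n_pixels, n_bands))

# Ridge regression: W = (X^T X + lambda I)^-1 X^T Y
lam = 1e-3
W = np.linalg.solve(rgb.T @ rgb + lam * np.eye(3), rgb.T @ spectra)

# Reconstruct spectra for a new RGB image (4 x 4 x 3 -> 4 x 4 x 10)
image = rng.random((4, 4, 3))
reconstructed = image.reshape(-1, 3) @ W
print(reconstructed.reshape(4, 4, n_bands).shape)  # (4, 4, 10)
```

A linear map like this cannot recover information the three RGB bands do not carry at all; the deep-learning models discussed below exploit spatial context and learned priors to do far better.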

In two new papers, researchers at the University of Illinois Urbana-Champaign explore the reconstruction of multispectral and hyperspectral images from RGB for chemical analysis of sweet potatoes and maize.

“An RGB camera captures only the visible range in three bands, red, green, and blue. The pictures cannot provide any chemical information, which you often need for crop analysis. We reconstructed images from these three bands to include information from the near-infrared range, which you can use to determine chemical composition,” said Mohammed Kamruzzaman, assistant professor in the Department of Agricultural and Biological Engineering (ABE), part of the College of Agricultural, Consumer and Environmental Sciences and The Grainger College of Engineering at the U. of I. He is corresponding author on both studies.

“This work has many potential applications in the agricultural industry and can significantly lower costs. While a multispectral camera costs $10,000 or more, you can get an RGB camera for a few hundred dollars,” he added.

Analysis of sweet potato attributes

In the first paper, the researchers provide a large dataset of reconstructed images for chemical analysis of sweet potatoes that anyone can access and use for their own modeling.

“Most existing image reconstruction models focus on non-biological objects like tables and chairs, which are very different from biological objects. Our goal was to create an RGB-to-hyperspectral image dataset for a biological sample and make it publicly available,” said lead author Ocean Monjur, doctoral student in ABE.

Sweet potatoes are a popular food source, and they are also used for a wide range of industrial purposes including textiles, biodegradable polymers, and biofuels. Assessing quality attributes such as brix (sugar content), moisture, and dry matter is important for determining the usage and value of sweet potatoes. Chemical laboratory analysis is time-consuming and destroys the samples. Hyperspectral imaging (HSI) is fast, accurate, and non-destructive, but it is expensive and complicated.

That’s why the researchers created Agro-HSR, a large database of paired RGB and hyperspectral images for RGB-to-HSI reconstruction in the agricultural industry. The dataset includes 1,322 image pairs from 790 sweet potato samples, collected from one or both sides of each potato. For 141 of the samples, they measured brix, firmness, and moisture content to evaluate the accuracy of the reconstructed images, finding them to be highly correlated with the actual measurements.

They tested their dataset on five popular hyperspectral imaging reconstruction models to determine which performed best, finding that two models (Restormer and MST++) consistently outperformed the others on all metrics.
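Hyperspectral reconstruction models are typically compared with a handful of standard error metrics. The sketch below implements three that are common in this literature: mean relative absolute error (MRAE), root-mean-square error (RMSE), and spectral angle (SAM). The formulas are standard; whether these are the exact metrics reported in the papers is an assumption, and the arrays here are synthetic.

```python
import numpy as np

def mrae(pred, ref, eps=1e-8):
    """Mean relative absolute error between reconstructed and true spectra."""
    return float(np.mean(np.abs(pred - ref) / (np.abs(ref) + eps)))

def rmse(pred, ref):
    """Root-mean-square error across all pixels and bands."""
    return float(np.sqrt(np.mean((pred - ref) ** 2)))

def sam(pred, ref, eps=1e-8):
    """Mean spectral angle (radians) between per-pixel spectra."""
    dot = np.sum(pred * ref, axis=-1)
    denom = np.linalg.norm(pred, axis=-1) * np.linalg.norm(ref, axis=-1) + eps
    return float(np.mean(np.arccos(np.clip(dot / denom, -1.0, 1.0))))

rng = np.random.default_rng(1)
ref = rng.uniform(0.1, 1.1, (8, 8, 31))   # hypothetical 31-band ground truth
pred = ref + 0.02 * rng.standard_normal(ref.shape)
print(mrae(pred, ref), rmse(pred, ref), sam(pred, ref))
```

Lower is better for all three; SAM is useful because it measures whether the shape of each pixel's spectrum is right, independent of its overall brightness.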

“To our knowledge, this is the largest dataset for hyperspectral image reconstruction, not just for agriculture but overall. We are providing this database so anyone can use it to train or develop their own models, including models for other agricultural products,” Kamruzzaman said.

Evaluating chlorophyll content for maize growth

In the second paper, the researchers describe a novel method for multispectral image reconstruction to analyze chlorophyll content in maize. They also introduce a simple device that people can use to take pictures in the field and get immediate results.

“Our target measure is chlorophyll content, which is an indicator of plant growth. With this device you can take a picture, get the chlorophyll content, and determine the crop’s growth status,” Kamruzzaman said.

To develop their model, the researchers collected images from three different locations: a research field in Hengshui, China; the U. of I. Plant Biology Greenhouse; and the U. of I. Vegetable Crops Research Farm.

At each location, they divided the area into varying levels of soil fertility, and at the Illinois research farm, they subjected the maize to three levels of stress by flooding throughout the growth period.

In all of these settings, they tested several modeling approaches to reconstructing multispectral images from RGB. Based on their findings, they created a novel model called Window-Adaptive Spatial-Spectral Attention Transformer (WASSAT), whose reconstructions aligned more accurately with the actual data.

“We combined spectral and spatial attention modes to establish an adaptive window that can discern crops from soil and other elements, capturing the complexity of a field environment. Then we reconstructed 10-band images to predict chlorophyll content, and our model outperformed the others,” said lead author Di Song, doctoral student in ABE.
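The spectral-attention half of this idea can be sketched in a few lines: treat each spectral band as a token and let every band attend to every other band, so a reconstructed band can borrow information from across the spectrum. This toy version is a heavy simplification of WASSAT (it omits the spatial attention and the adaptive windows entirely), and all shapes and weights here are made up for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spectral_attention(feat, Wq, Wk):
    """feat: (pixels, bands). Attention is computed between bands.

    Each band is projected to a query and a key; the resulting
    (bands x bands) attention map mixes band features together.
    """
    x = feat.T                                          # (bands, pixels)
    q, k = x @ Wq, x @ Wk                               # (bands, d)
    scores = softmax(q @ k.T / np.sqrt(Wq.shape[1]))    # (bands, bands)
    return (scores @ x).T                               # (pixels, bands)

rng = np.random.default_rng(2)
bands, pixels, d = 10, 64, 16
feat = rng.standard_normal((pixels, bands))
Wq = rng.standard_normal((pixels, d))
Wk = rng.standard_normal((pixels, d))
out = spectral_attention(feat, Wq, Wk)
print(out.shape)  # (64, 10)
```

In the full model, a spatial-attention branch plays the complementary role, letting each pixel draw on its neighbourhood within an adaptively sized window, which is what helps separate crop pixels from soil.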

“We have developed a handheld device that incorporates the model. You can use it to take an RGB image, which will be converted to a multispectral image that provides much more information,” he said. “Next, we plan to add a prediction model, so the farmer can simply take a picture and get the chlorophyll content without having to interpret the images.”
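One common way to build the kind of prediction step Song describes is to compute a vegetation index from near-infrared and red-edge reflectance and calibrate it against measured chlorophyll. The index below (CI red-edge) is a standard one from the remote-sensing literature, but the band choices, calibration data, and linear fit here are entirely hypothetical; the published model may work differently.

```python
import numpy as np

def red_edge_chlorophyll_index(nir, red_edge, eps=1e-8):
    """CI red-edge = NIR / red-edge - 1, an index correlated with chlorophyll."""
    return nir / (red_edge + eps) - 1.0

rng = np.random.default_rng(3)

# Hypothetical calibration data: measured chlorophyll vs. observed index
chl_true = rng.uniform(20.0, 60.0, 50)                # units made up
index = 0.04 * chl_true + 0.2 + 0.05 * rng.standard_normal(50)

# Fit a linear calibration: chlorophyll ~ a * index + b
a, b = np.polyfit(index, chl_true, 1)

# Predict chlorophyll for one pixel of a reconstructed multispectral image
nir_band, red_edge_band = 0.62, 0.31                  # hypothetical reflectances
ci = red_edge_chlorophyll_index(nir_band, red_edge_band)
print(a * ci + b)
```

The appeal of this design is that once the RGB-to-multispectral reconstruction supplies the NIR and red-edge bands, the chlorophyll estimate itself is cheap enough to run on a handheld device in the field.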

This approach offers a cost-effective solution for accurate crop monitoring, enabling precise growth assessment and stress detection, the researchers concluded in the paper.

The first paper, “Agro-HSR: The first large-scale agricultural-focused hyperspectral dataset for deep learning-based image reconstruction and quality prediction,” is published in Computers and Electronics in Agriculture [DOI:10.1016/j.compag.2025.111103]. Additional authors are Md. Toukir Ahmed and Girish Chowdhary. This work was funded by the U.S. Department of Agriculture Agricultural Marketing Service through the Specialty Crop Multistate Program grant AM21SCMPMS1010.

The second paper, “Multispectral image reconstruction from RGB image for maize growth status monitoring based on window-adaptive spatial-spectral attention transformer,” is also published in Computers and Electronics in Agriculture [DOI:10.1016/j.compag.2025.111062]. Additional authors are Hong Sun and Esther Ngumbi.

Research in the College of ACES is made possible in part by Hatch funding from USDA’s National Institute of Food and Agriculture.

Attached files
  • Di Song, doctoral student in Agricultural and Biological Engineering at the University of Illinois, photographs maize plants.
  • Ocean Monjur, doctoral student in Agricultural and Biological Engineering at the University of Illinois, photographs a sweet potato.