Hyperspectral ocean color observations can reveal subtle signals linked to phytoplankton, colored dissolved organic matter, suspended particles, harmful algal blooms, and carbon cycling. NASA's Plankton, Aerosol, Cloud, ocean Ecosystem (PACE) mission now provides this type of detailed spectral coverage, but its record is still very short. By contrast, the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard Aqua has collected ocean color data for more than two decades, though with far fewer bands. This mismatch has limited scientists' ability to combine long-term continuity with fine spectral detail. Because of these challenges, it is necessary to carry out in-depth research on reconstructing hyperspectral reflectance from legacy multispectral satellite observations.
Researchers from the Institute of Oceanology, Chinese Academy of Sciences, together with the University of Chinese Academy of Sciences, published the study (DOI: 10.34133/remotesensing.1036) in Journal of Remote Sensing on February 5, 2026. Their work introduces SA-CGAN, a spectral-attention conditional generative adversarial network designed to reconstruct hyperspectral remote sensing reflectance from Aqua MODIS observations using PACE OCI as the reference. The goal is to overcome a major challenge in ocean remote sensing: how to extend the value of the new hyperspectral era backward in time using older multispectral archives.
The proposed model consistently outperformed earlier empirical reconstruction methods across the 400 to 700 nm range. It maintained R² values above 0.90 across all wavelengths, while competing methods showed larger drops, especially in difficult spectral regions. The strongest gains appeared below 450 nm, in pigment-rich waters, and in the red-edge region where conventional models often struggle. These improvements came from combining physical sensor constraints, spectral attention, and adversarial learning in one framework.
The team trained SA-CGAN using about one year of globally paired PACE OCI and Aqua MODIS data. The input was 10-band MODIS multispectral reflectance, and the output was hyperspectral reflectance reconstructed across 136 OCI-like bands between 400 and 700 nm. To improve realism and physical consistency, the network included three key elements: a generator-discriminator adversarial structure, a spectral attention mechanism that emphasized informative bands, and a spectral response function constraint that forced reconstructed spectra to remain consistent with MODIS band measurements. The model was trained for 150 epochs with a learning rate of 0.001, and each epoch took about 5 minutes on an NVIDIA GeForce RTX 4090 GPU. Validation showed R² values ranging from 0.92 to 0.96 across the visible spectrum. In open-ocean waters, reconstruction differences in blue bands were often below 5%, though errors rose in red bands and complex coastal waters where signals are weaker and optical conditions are more variable.
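The mechanism described above can be illustrated with a short sketch. This is not the authors' code: the layer sizes, random weights, and synthetic spectral response matrix below are assumptions for illustration; only the 10-band input, the 136-band output, the spectral attention over input bands, and the spectral-response-function (SRF) consistency constraint come from the paper's description.

```python
# Hedged sketch of the generator's spectral attention and the SRF
# consistency term in SA-CGAN (forward pass only; weights are random
# placeholders, not trained values).
import numpy as np

N_MODIS, N_OCI = 10, 136  # MODIS input bands -> OCI-like output bands
rng = np.random.default_rng(0)


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class SpectralAttentionGenerator:
    """Attention weights over the 10 MODIS bands, then an MLP that
    expands the reweighted bands to 136 hyperspectral bands."""

    def __init__(self, hidden=64):
        # illustrative random weights; a real model would learn these
        self.Wa1 = rng.normal(0.0, 0.1, (N_MODIS, hidden))
        self.Wa2 = rng.normal(0.0, 0.1, (hidden, N_MODIS))
        self.W1 = rng.normal(0.0, 0.1, (N_MODIS, 128))
        self.W2 = rng.normal(0.0, 0.1, (128, N_OCI))

    def __call__(self, rrs_modis):
        # per-band attention in (0, 1): emphasize informative bands
        attn = sigmoid(np.maximum(rrs_modis @ self.Wa1, 0.0) @ self.Wa2)
        weighted = rrs_modis * attn
        # expand 10 reweighted bands to 136 OCI-like bands
        return np.maximum(weighted @ self.W1, 0.0) @ self.W2


def srf_consistency_loss(rrs_oci_hat, rrs_modis, srf):
    """Band-average the reconstructed spectrum through the MODIS spectral
    response functions and penalize disagreement with the measured bands.
    srf has shape (10, 136), each row summing to 1."""
    simulated_modis = rrs_oci_hat @ srf.T  # (batch, 136) -> (batch, 10)
    return float(np.mean((simulated_modis - rrs_modis) ** 2))
```

In training, this SRF term would be added to the adversarial loss so that the generator cannot produce spectra that look plausible to the discriminator yet contradict the original MODIS measurements.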
The study suggests that physically informed AI can do more than interpolate missing data; it can help connect satellite generations into a more coherent long-term observing system. The authors note that SA-CGAN preserves key spectral information important for oceanographic analysis, although they also caution that residual uncertainty in the reconstructed reflectance over the 413 to 490 nm range means some pigment-retrieval applications should use the reconstructed data carefully.
This approach could greatly expand the scientific value of older ocean color archives by giving them hyperspectral-like detail. That would support longer investigations of marine ecosystem change, water quality, biogeochemical cycles, and carbon uptake on a global scale. The method may also help bridge future sensor transitions, making satellite records more consistent over time. Even so, the study highlights that coastal waters, low-signal red bands, and some pigment-sensitive applications still require caution, suggesting that future work may focus on local tuning, quality screening, and multi-sensor fusion.
###
References
DOI
10.34133/remotesensing.1036
Original Source URL
https://spj.science.org/doi/10.34133/remotesensing.1036
Funding information
This work was supported in part by the National Natural Science Foundation of China (grant numbers 42076200, 42221005, 42090044, and 42206202) and in part by the Key Research and Development Program of Shandong Province (grant number 2022CXPT020).
About Journal of Remote Sensing
The Journal of Remote Sensing, an online-only Open Access journal published in association with AIR-CAS, promotes the theory, science, and technology of remote sensing, as well as interdisciplinary research within earth and information science.