AI model uses physics to correct remote sensing data

By John Roach

Water vapor, carbon dioxide, ozone, methane, and other atmospheric gases, along with turbulence and temperature changes, absorb, reflect, and scatter sunlight as it passes through the atmosphere, bounces off the Earth’s surface, and reaches a sensor on a remote sensing satellite. As a result, the spectral data the sensor receives is distorted.

Scientists know this and have developed several methods to account for the corrupting influence of the atmosphere on remote sensing data.

“This problem is as old as aerial imagery,” said James Koch, a data scientist at Pacific Northwest National Laboratory (PNNL) who has developed a new way to solve it using physics-based machine learning, a branch of artificial intelligence, while also improving remote sensing capabilities.

Koch presented a paper describing his physics-based machine learning framework last week at the International Geoscience and Remote Sensing Symposium in Athens, Greece. This work is part of PNNL’s Remote Exploitation Capability and was supported by PNNL’s Laboratory Directed Research and Development portfolio.

Scientists can correct for atmospheric corruption because they understand the physics of how the atmosphere distorts sunlight on its way to the sensor, which allows them to remove the atmosphere’s influence from the collected data. This process is called atmospheric correction, and it typically requires an atmospheric transmission profile: a representation of the properties and composition of the atmosphere at different altitudes that shows how light at different wavelengths interacts with it.
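The idea can be illustrated with a minimal sketch. This toy one-layer radiative model (an assumption for illustration, not the authors' pipeline) treats at-sensor radiance per band as path radiance plus transmitted, sunlit surface reflectance; given a transmission profile, the correction simply inverts that relation. All numbers and function names here are made up.

```python
import numpy as np

# Toy per-band radiative model (illustrative assumption):
#   L_sensor = L_path + T * rho * E_sun / pi
# where T is atmospheric transmission, L_path is radiance scattered into
# the sensor by the atmosphere, E_sun is solar irradiance, and rho is the
# surface reflectance we want to recover.

def atmospheric_correction(l_sensor, t, l_path, e_sun):
    """Invert the toy model to estimate surface reflectance per band."""
    return np.pi * (l_sensor - l_path) / (t * e_sun)

# Three spectral bands with made-up profile values.
l_sensor = np.array([0.30, 0.45, 0.20])   # measured at-sensor radiance
t        = np.array([0.80, 0.90, 0.70])   # transmission profile
l_path   = np.array([0.05, 0.04, 0.06])   # path radiance
e_sun    = np.array([1.60, 1.80, 1.20])   # solar irradiance

rho = atmospheric_correction(l_sensor, t, l_path, e_sun)
```

The hard part in practice is obtaining `t` and `l_path`; that is exactly what an atmospheric transmission profile provides, and what Koch's approach aims to infer from the data itself.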

It is in creating an atmospheric transmission profile without prior knowledge that Koch’s AI technique has the potential to be a game changer.

Today, many atmospheric correction applications rely on off-the-shelf tools that use generic, statistically based atmospheric profiles. These tools are sufficient for urgent tasks such as disaster response monitoring and are cost-effective for large-area mapping. Applications where high accuracy is paramount, such as target detection, instead require high-fidelity profiles that are data-intensive and computationally expensive to produce.

“I took some of the methods that subject matter experts use and integrated them into a machine learning pipeline so that we can do this process in a data-driven way,” Koch said. “This is an intermediate approach when higher fidelity is required, but we don’t necessarily have all the resources to identify all the properties associated with the atmosphere. We use the data that is available.”

Physics-based machine learning

To train and evaluate the machine learning pipeline, Koch used a dataset of labeled aerial images from Cooke City, Montana, that includes cars and pieces of fabric with known spectral signatures. He used 112 labeled samples, about 0.05% of those available in the scene, and ran the training on a mid-range laptop.

The trained model can take pixels from any spectral scene to infer an atmospheric transmission profile and automatically perform atmospheric correction. At the heart of the approach is a series of differential equations that describe how sunlight changes as it passes through the atmosphere, bounces off a target, re-enters the atmosphere, and hits a sensor.

“The differential equation constraint, that is, physics-based machine learning, is the secret to making sure this works well,” Koch said. “By construction, this model can make a prediction that will satisfy first-order physics.”

In addition to performance that falls between standard models and the high-fidelity approach, Koch’s framework is bidirectional: it can both remove the influence of the atmosphere from a spectral scene collected by a remote sensor and infer how a ground material would appear if photographed through a particular atmosphere.
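Bidirectionality amounts to having an invertible forward model. The sketch below (a toy single-band simplification, not the authors' code) shows the two directions as a pair of functions: simulate a material through an atmosphere, then correct the atmosphere back out and recover the original reflectance.

```python
import numpy as np

# Toy single-band model (assumed for illustration):
#   forward:  reflectance -> at-sensor radiance through an atmosphere
#   inverse:  at-sensor radiance -> estimated reflectance

def forward(rho, t, l_path, e_sun):
    """Simulate how a ground material looks through a given atmosphere."""
    return l_path + t * rho * e_sun / np.pi

def inverse(l_sensor, t, l_path, e_sun):
    """Strip the same atmosphere back out of a measured radiance."""
    return np.pi * (l_sensor - l_path) / (t * e_sun)

# Made-up atmospheric parameters for one band.
t, l_path, e_sun = 0.85, 0.05, 1.6

rho = 0.3                                   # true ground reflectance
l_sim = forward(rho, t, l_path, e_sun)      # "photograph" it through the atmosphere
rho_hat = inverse(l_sim, t, l_path, e_sun)  # then correct the atmosphere out
```

The round trip returns the original reflectance exactly in this toy model; in the real framework both directions share the same learned, physics-constrained transmission profile.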

“Some things are highlighted or hidden depending on where you look at them,” Koch said. “It’s not a one-size-fits-all solution. You have to look and probe where things are most fruitful.”

Real-world research

Remote sensing is used for tasks ranging from drought and vegetation indices that track changes in photosynthetic activity and water content over time to detecting methane plumes, activity at foreign military bases, and human traffic at border crossings.

Different atmospheric correction approaches are applied to different scenarios, depending on factors such as time, cost and available data.

Luis Cedillo, a PNNL intern and undergraduate student at the University of Texas at El Paso, presented a conference poster at SPIE Defense and Commercial Sensing 2024 in National Harbor, Maryland, on using physics-based machine learning for coastal ecosystem health monitoring. He used the machine learning pipeline to jointly learn the profile of the atmosphere and coastal waters, unlocking a new capability for monitoring coral reef health from satellites.

The researchers are currently refining their approach for applications where data is limited but high fidelity is required, such as target detection.

“The main advantage here is that we can achieve good accuracy with a limited amount of data without having to rely on a lot of prior knowledge, such as the position of the sensor or the position of the sun,” Koch said. “We learn these things as we go.”
