Using Artificial Intelligence, Better Pollution Predictions Are in the Air

Fueled by increasing temperatures and droughts, severe wildfires are on the rise around the world — as are the smoke-borne contaminants that harm the environment and human health. In 2023, Canada recorded its worst wildfire season ever, with fires releasing more than 290 million tons of carbon into the atmosphere. California also experienced record-setting fire seasons in 2020 and 2021.

The side effects of this pollution range from irritating to deadly. Smoke from the Canadian wildfires drifted as far as Europe and set off air quality alerts in cities across the United States and Canada as it inflicted stinging eyes, stuffy noses and labored breathing on millions of people. The National Institutes of Health estimates that air pollution of all kinds contributes to millions of deaths every year globally.

“We know that dangerous air quality levels are a significant threat, but because exposure happens slowly over time, it is more difficult to quantify,” said Marisa Hughes, the climate intelligence lead at the Johns Hopkins Applied Physics Laboratory (APL) in Laurel, Maryland, and assistant manager of the Human and Machine Intelligence program. “A more accurate, higher-resolution model can help protect populations by providing them with information about air quality over time so that they can better plan ahead.”

Intelligent Weather Forecasting

To better understand where smoke pollutants will travel and when, researchers at APL and their collaborators at Morgan State University, NASA and the National Oceanic and Atmospheric Administration (NOAA) are leveraging artificial intelligence (AI) to simulate atmospheric models. This family of APL projects will ultimately help forecasters deliver earlier, higher-resolution and more accurate predictions of the movement and evolution of air quality threats, such as wildfire plumes.

Current weather forecasting methods rely on numerical models, in which massive amounts of data, such as atmospheric composition, air temperature and pressure, are run through complex equations that follow the laws of physics, chemistry and atmospheric transport to produce simulations of future weather events. The span of time each model step advances is called a timestep; to predict further into the future, across multiple timesteps, the models need more computing power, data and time to analyze all the variables.
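As a rough illustration of that timestep structure, the toy one-variable advection model below advances a simulated pollutant plume one fixed interval at a time, so forecasting further ahead means running more sequential steps. This is a sketch for intuition only; the real systems solve far richer physics and chemistry.

```python
import numpy as np

# Toy 1-D advection model: a pollutant plume carried along by a constant
# wind. Real forecast models solve far richer physics and chemistry, but
# the sequential timestep structure is the same.
def step(conc, wind=1.0, dx=1.0, dt=0.5):
    """Advance the concentration field by one timestep (upwind scheme)."""
    return conc - wind * dt / dx * (conc - np.roll(conc, 1))

conc = np.exp(-0.5 * ((np.arange(100) - 20) / 5.0) ** 2)  # initial plume

# Predicting further ahead requires more sequential steps, so the
# computational cost grows with the forecast horizon.
for _ in range(48):  # e.g., 48 half-hour steps for a 24-hour forecast
    conc = step(conc)
print(f"plume peak has moved to cell {conc.argmax()}")
```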

“In our case, the models are looking at the movement of nearly 200 different pollutants in the atmosphere for every timestep, sequentially. That’s approximately 40% of their computation,” said Principal Investigator Jennifer Sleeman, a senior AI researcher at APL. “And then they also have to consider how these chemicals are interacting with one another and how they’re decaying — the chemistry is approximately 30% of the computation. It takes a significant amount of computing power to perform air quality forecasting with all of the variables used.”

When it comes to forecasting, just one run of the model isn’t enough. Researchers use a technique called ensemble modeling, in which they run anywhere from a handful to hundreds of variations of models to account for possible changes in conditions — such as a cold spell or an incoming pressure system — and use the mean of those variations for forecasting.
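A minimal sketch of that ensemble idea, reusing the toy advection model above: perturb each member slightly, run every member to the forecast horizon, and average the results. The member count, the choice to perturb wind speed and the perturbation size are illustrative assumptions, not details of the APL system.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(conc, wind, dx=1.0, dt=0.5):
    """One upwind-advection timestep (same toy model as above)."""
    return conc - wind * dt / dx * (conc - np.roll(conc, 1))

def run_member(initial, wind, n_steps=48):
    """Run a single ensemble member out to the forecast horizon."""
    conc = initial.copy()
    for _ in range(n_steps):
        conc = step(conc, wind)
    return conc

initial = np.exp(-0.5 * ((np.arange(100) - 20) / 5.0) ** 2)  # initial plume

# Each member gets a slightly perturbed wind speed to stand in for
# uncertain conditions; 50 members echoes the "50-plus models" scale
# mentioned in the article.
members = [run_member(initial, rng.normal(1.0, 0.1)) for _ in range(50)]
forecast = np.mean(members, axis=0)  # ensemble mean used as the forecast
spread = np.std(members, axis=0)     # member-to-member spread, a proxy for uncertainty
```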

“Running one model is computationally challenging — so imagine running 50-plus models. In some cases, this is just not feasible due to cost and computing availability,” said Sleeman.

This is where APL’s AI-assisted method improves upon the speed and accuracy of the forecast. The team developed deep-learning models that simulate ensembles while using fewer, shorter timesteps of input.

“The amount of computation we could save with our networks is tremendous,” said Sleeman. “We’re speeding things up because we’re asking the models to compute shorter timesteps, which is easier and faster to do, and we’re using the deep-learning emulator to simulate those ensembles and account for variations in weather data.”
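The article does not describe APL’s network design, but a deliberately tiny stand-in can show the shape of the approach: a learned model reads a short window of recent timesteps and predicts the next ensemble-mean state in a single forward pass, instead of integrating every ensemble member. Everything below (PyTorch, the MLP architecture, the layer sizes) is an assumption for illustration.

```python
import torch
import torch.nn as nn

N_CELLS = 100      # grid cells in a toy 1-D domain
N_INPUT_STEPS = 7  # short input window, echoing the seven timesteps cited later

class EnsembleEmulator(nn.Module):
    """Maps a short history of states to the next ensemble-mean state.

    A deliberately tiny MLP stand-in; the actual emulator architecture
    is not described in the article.
    """
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                            # (B, 7, 100) -> (B, 700)
            nn.Linear(N_INPUT_STEPS * N_CELLS, 256),
            nn.ReLU(),
            nn.Linear(256, N_CELLS),                 # predicted next state
        )

    def forward(self, history):
        return self.net(history)

model = EnsembleEmulator()
history = torch.randn(8, N_INPUT_STEPS, N_CELLS)  # a batch of dummy input windows
prediction = model(history)                       # one cheap forward pass
print(prediction.shape)                           # torch.Size([8, 100])
```

The speedup comes from replacing dozens of sequential physics runs with a single inexpensive network evaluation once the emulator is trained.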

This side-by-side video demonstrates the accuracy of the forecast performed by APL’s Deep-Learning Network compared to actual ozone data.

Credit: Johns Hopkins APL

The APL researchers and their collaborators at Morgan State University, NASA and NOAA applied the model to NASA’s Goddard Earth Observing System Composition Forecast (GEOS-CF). Every day, GEOS-CF produces a five-day global composition forecast at 25-kilometer resolution, or roughly 15 miles.

“NASA and NOAA have been searching for ways to increase the resolution of these forecasts,” said Hughes. “If you live next to a power plant or a highway, the air quality impacts are going to affect you differently.”

Trained on a one-year simulation of a GEOS-CF-like system, which included over 30 ensemble simulations, the deep-learning model has reliably produced 10-day forecasts that mirror ground-truth data. Where traditional models can require months’ worth of input data to produce estimates, the deep-learning emulator needed only seven timesteps, encapsulating 21 hours of input data, to produce accurate forecasts. By running these models faster, researchers have laid the groundwork for forecasting at higher resolutions.
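To make those numbers concrete: seven consecutive timesteps spanning 21 hours works out to three hours per step, and training pairs for such an emulator can be sliced from an archived simulation as sliding windows. The sketch below is a generic windowing scheme under that inferred step length, not APL’s actual training pipeline.

```python
import numpy as np

HOURS_PER_STEP = 3  # inferred: 7 steps x 3 hours = the 21 hours of input cited above
WINDOW = 7

# Stand-in for an archived year-long simulation: (timesteps, grid cells).
archive = np.random.rand(2920, 100)  # 2920 three-hour steps ~= one year

def make_training_pairs(series, window=WINDOW):
    """Slice (window -> next step) training pairs out of an archive."""
    inputs = np.stack([series[i : i + window]
                       for i in range(len(series) - window)])
    targets = series[window:]  # the state immediately after each window
    return inputs, targets

X, y = make_training_pairs(archive)
print(X.shape, y.shape)  # (2913, 7, 100) (2913, 100)
```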

Sleeman recently presented the team’s findings, along with other AI-assisted climate research, at the Association for the Advancement of Artificial Intelligence’s Fall Symposium, among other venues.

A Worldwide Effort

Both Hughes and Sleeman credit the advances of the broader AI community with enabling their efforts.

“If we tried the same thing five years ago, it might not have been as successful as it is today, because we’re building on the momentum of this accelerating research,” said Hughes. “We’re sharing our results and starting to see what methods and architectures are effective when you apply them to different problems around the world.”

This is one of several APL projects exploring additional applications of AI to climate intelligence challenges, such as forecasting climate tipping points, as part of the Laboratory’s growing efforts to ensure climate security.