A future without clouds? Mystery at the heart of climate projections


We hear a lot about how climate change will alter the land, sea and ice. But how will it affect clouds?

“Low clouds can dry and shrink like icebergs,” says Michael Pritchard, professor of Earth System Science at UC Irvine. “Or they can thicken and become more reflective.”

Shallow clouds are formed by small-scale eddies, as observed in nature. Researchers are using advanced computing to add higher-resolution cloud dynamics to global simulations. Image credit: NOAA

These two scenarios will lead to very different future climates. And that, Pritchard said, is part of the problem.

“If you ask two different climate models what the future looks like when we add more CO2, you get two very different answers. And the main reason for this is the way clouds are included in climate models.”

There is no denying that clouds and aerosols – the bits of soot and dust around which cloud droplets form – are an important part of the climate equation. The problem is that these phenomena occur on length and time scales that today’s models can’t come close to reproducing. Instead, they are fed into the models through a variety of approximations.

Analyses of global climate models consistently show that clouds are the biggest source of uncertainty and instability.

Community code re-tools

While the most advanced U.S. global climate model is having a hard time approaching 4 km global resolution, Pritchard estimates that models need a resolution of at least 100 meters to capture the fine-scale turbulent eddies that form shallow cloud systems – 40 times higher resolution in each direction. By Moore’s law, it could take until 2060 before the computing power to capture this level of detail becomes available.
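
A rough back-of-the-envelope version of that timeline – assuming, purely for illustration, that simulation cost grows with the fourth power of resolution (three spatial dimensions plus a proportionally smaller time step) and that computing power doubles roughly every two years – runs as follows:

    # Back-of-the-envelope sketch only; the scaling exponent and doubling time are
    # common rules of thumb, not figures taken from Pritchard's study.
    import math

    current_res_m = 4000   # ~4 km, today's most ambitious global models
    target_res_m = 100     # resolution needed to resolve shallow-cloud eddies
    refinement = current_res_m / target_res_m      # 40x finer in each direction

    cost_factor = refinement ** 4                  # 3 space dimensions + a finer time step
    doublings = math.log2(cost_factor)             # ~21 doublings of compute needed
    years = doublings * 2                          # assume a ~2-year doubling time

    print(f"{refinement:.0f}x refinement -> {cost_factor:,.0f}x cost, ~{years:.0f} years away")
    # 40x refinement -> 2,560,000x cost, ~43 years away: on the order of the 2060s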

Pritchard is working to close this glaring gap by breaking the climate modeling problem into two parts: a coarse-grained, lower-resolution (100 km) planetary model, and many small patches with 100- to 200-meter resolution. The two simulations run independently and then exchange data every 30 minutes to make sure that neither drifts off track or becomes unrealistic.
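
As a purely illustrative sketch of that two-scale handshake – placeholder physics, toy sizes, and invented function names, not the actual CESM or E3SM code – the coupling loop looks roughly like this:

    # Hypothetical multiscale (MMF-style) coupling loop; all names and physics are stand-ins.
    import numpy as np

    COUPLE_EVERY_S = 30 * 60   # exchange data every 30 simulated minutes
    FINE_DT_S = 10             # embedded cloud patches take many small steps in between
    N_COLUMNS = 8              # coarse (100 km) grid columns, toy size
    N_FINE = 64                # points inside each embedded high-resolution patch

    coarse_state = np.zeros(N_COLUMNS)                    # e.g., column-mean temperature anomaly
    patches = [np.zeros(N_FINE) for _ in range(N_COLUMNS)]

    def step_coarse(state, dt):
        """Placeholder for one step of the low-resolution planetary model."""
        return state + 0.001 * dt / COUPLE_EVERY_S

    def step_patch(patch, forcing, dt):
        """Placeholder for one step of an embedded cloud-resolving patch."""
        eddies = 0.01 * np.random.randn(patch.size)       # stand-in for small-scale turbulence
        return patch + (dt / COUPLE_EVERY_S) * (forcing + eddies)

    for coupling in range(48):                            # 48 half-hour exchanges = one day
        coarse_state = step_coarse(coarse_state, COUPLE_EVERY_S)
        for i in range(N_COLUMNS):
            t = 0
            while t < COUPLE_EVERY_S:                     # each patch runs independently...
                patches[i] = step_patch(patches[i], coarse_state[i], FINE_DT_S)
                t += FINE_DT_S
            coarse_state[i] = patches[i].mean()           # ...then feeds its average back

    print(coarse_state)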

His team reported the results of these efforts in the Journal of Advances in Modeling Earth Systems in April 2022. The study was supported by grants from the National Science Foundation (NSF) and the Department of Energy (DOE).

This method of climate simulation, called the Multiscale Modeling Framework (MMF), has been around since 2000 and has long been an option in the Community Earth System Model (CESM), developed at the National Center for Atmospheric Research. The idea has recently been revived at the Department of Energy, where researchers on the Energy Exascale Earth System Model (E3SM) have pushed it to new computational frontiers as part of the Exascale Computing Project. Pritchard’s co-author Walter Hannah of Lawrence Livermore National Laboratory helps lead the effort.

“The model does an end run around the hardest problem – whole-planet modeling,” Pritchard explains. “It has thousands of little micro-models that capture things like realistic shallow cloud formation, which only emerges at very high resolution.”

“The Multiscale Modeling Framework approach is also ideal for DOE’s upcoming GPU-based exascale computers,” says Mark Taylor, Computational Science Team Lead for the Energy Exascale Earth System Model (E3SM) project and a research scientist at Sandia National Laboratories. “Each GPU has the horsepower to run hundreds of the micro-models while still matching the throughput of the lower-resolution planetary model.”

Pritchard’s research and new approach were made possible in part by the NSF-funded Frontera supercomputer at the Texas Advanced Computing Center (TACC). The fastest university supercomputer in the world, Frontera allows Pritchard to run his models at time and length scales accessible on only a handful of systems in the U.S. and to test their potential for cloud modeling.

“We have developed a way for the supercomputer to best split the work of simulating cloud physics across different regions of the world with different resolutions… so that it runs much faster,” the research team wrote.

Simulating the atmosphere this way provides Pritchard with the resolution needed to capture the physical processes and turbulent eddies involved in cloud formation. The researchers showed that the multi-resolution approach did not produce unwanted side effects, even where patches with different cloud-resolving grid structures met.

“We were pleased to see that the difference was small,” he said. “This will provide new flexibility for all climate model users who want to focus high resolution in different places.”

Disentangling and reconnecting the different scales of the CESM model was one challenge that Pritchard’s team overcame. Another involved reprogramming the model so that it could take advantage of the ever-growing number of processors available on modern supercomputing systems.

Pritchard and his team – UCI postdoctoral scholar Liran Peng and University of Washington research scientist Peter Blossey – tackled this by breaking the inner domains of CESM’s embedded cloud models into smaller pieces that can be solved in parallel using MPI, the Message Passing Interface – a way of exchanging messages between multiple computers running a parallel program across distributed memory – and orchestrating these calculations to use many more processors.
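
A minimal sketch of what such an MPI domain decomposition looks like in general – using the mpi4py bindings and made-up array names, not the team’s actual model code – is shown below:

    # Illustrative MPI domain decomposition, not the CESM/E3SM implementation.
    # Assumes mpi4py is installed; run with, e.g., `mpiexec -n 4 python decompose.py`.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    n_total = 1024                 # total columns in one embedded cloud model (toy number)
    counts = [n_total // size + (1 if r < n_total % size else 0) for r in range(size)]
    start = sum(counts[:rank])
    local = np.zeros(counts[rank]) # this rank's slice of the cloud-model domain

    # Each rank advances its own piece of the domain in parallel (placeholder update).
    local += np.sin(np.arange(start, start + counts[rank]) * 0.01)

    # Periodically gather a diagnostic (here, the domain mean) back to rank 0.
    local_sum = np.array([local.sum()])
    global_sum = np.zeros(1)
    comm.Reduce(local_sum, global_sum, op=MPI.SUM, root=0)
    if rank == 0:
        print("domain mean:", global_sum[0] / n_total)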

“Doing so already seems to provide a four-fold speedup with great efficiency. That means I can be four times as ambitious with my cloud-resolving models,” he said. “I’m really optimistic that this dream of regionalizing and MPI-decomposing is leading to a totally different landscape of what’s possible.”

Cloud machine learning

Pritchard sees another promising approach in machine learning, which his team has been exploring since 2017. “I’ve been struck by how effectively a dumb sheet of neurons can reproduce these partial differential equations,” says Pritchard.

Pritchard’s research and new approach was made possible in part by the NSF-funded Frontera supercomputer at TACC. The fastest university supercomputer in the world, Frontera allows Pritchard to run his models at time and length scales accessible on only a handful of systems in the U.S. and to test their potential for cloud modeling. Image credit: TACC

In a paper submitted last fall, Pritchard, lead author Tom Beucler of UCI, and others describe a machine learning approach that successfully predicts atmospheric conditions even in climate regimes it was not trained on, where others have struggled to do so.

This ‘climate-invariant’ model incorporates physical knowledge of climate processes into the machine learning algorithms. Their study – which used Stampede2 at TACC, Cheyenne at the National Center for Atmospheric Research, and Expanse at the San Diego Supercomputer Center – showed that the machine learning method can maintain high accuracy across a wide range of climates and geographies.
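
The core idea is to rescale raw inputs into quantities whose statistics shift less between climates before they ever reach the neural network. The toy example below shows one such physically motivated rescaling – specific humidity converted to relative humidity with a simplified saturation formula – and is only an illustration of the concept, not the published model or its actual inputs:

    # Toy 'climate-invariant' input transformation; formulas are simplified and the
    # choice of variables is illustrative, not taken from the Beucler et al. paper.
    import numpy as np

    def saturation_vapor_pressure(T_kelvin):
        """Approximate saturation vapor pressure (Pa), Magnus-type fit."""
        T_c = T_kelvin - 273.15
        return 611.2 * np.exp(17.67 * T_c / (T_c + 243.5))

    def to_climate_invariant(q, T, p):
        """Map specific humidity q (kg/kg), temperature T (K), pressure p (Pa)
        to relative humidity, which stays bounded even in much warmer climates."""
        e = q * p / (0.622 + 0.378 * q)        # vapor pressure from specific humidity
        return e / saturation_vapor_pressure(T)

    # Raw inputs from a (hypothetical) warm climate the emulator never saw in training.
    q = np.array([0.004, 0.012, 0.020])
    T = np.array([285.0, 300.0, 310.0])
    p = np.array([90000.0, 95000.0, 101000.0])

    features = to_climate_invariant(q, T, p)   # fed to the ML emulator instead of raw q
    print(features)                            # values remain in a familiar 0-1 range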

“If high-resolution cloud physics machine learning ever succeeds, it will transform everything about how we do climate simulations,” said Pritchard. “I am interested in seeing how efficiently and reliably the machine learning approach can succeed in complex settings.”

Pritchard is well positioned to do so. He is on the Executive Committee of the NSF Center for Learning the Earth with Artificial Intelligence and Physics, or LEAP – a new Science and Technology Center funded by NSF in 2021 and directed by his longtime collaborator on the topic, Professor Pierre Gentine. LEAP brings together climate scientists and data scientists to narrow the range of uncertainty in climate modeling, providing more accurate and actionable climate projections for immediate societal impact.

“All the research I’ve done before is what I would call ‘throughput-limited,’” says Pritchard. “However, if the goal is to produce short simulations to train machine learning models, that’s a different context.”

Pritchard hopes to soon use the results of his 50-meter embedded models to begin building a large training library. “It’s a really good dataset to use with machine learning.”

But is AI maturing fast enough? Time is of the essence in finding out the fate of the clouds.

“If those clouds shrink away, like ice sheets do, to reveal darker surfaces, that will amplify global warming and all the perils that come with it. But if they do the opposite of the ice sheets and thicken up, which they can, it will be less dangerous. Some have estimated this to be a trillion-dollar problem for society. And it has been in question for a long time,” said Pritchard.

Simulation by simulation, federally funded supercomputers are helping Pritchard and others close in on the answer to this important question.

“I am torn between genuine gratitude for the U.S. national computing infrastructure, which has been incredible in helping us develop and run climate models,” said Pritchard, “and the feeling that we need a new, federally funded Manhattan Project and cross-agency coordination to really address this.”

Source: TACC





