Exascale supercomputer models a year’s weather in a day
Clouds might be good news when it comes to synchronizing data across devices and keeping apps up to date, but when we’re talking about the weather – rather than remote IT infrastructure – they can be more problematic. Cloud formations are notoriously difficult for researchers to model, and getting predictions wrong has serious consequences when it comes to understanding dangerous weather systems such as storms. But exascale supercomputer hardware could change that, based on recent results in the US.
Numerical weather prediction has made huge leaps since computers started to make their mark in the 1950s. Early systems needed four hours to generate a 24-hour forecast and, with grid points spaced hundreds of kilometers apart, had no capacity to predict weather locally. By the early 2000s, ensemble models were part of the mix thanks to supercomputers such as Cray’s T3E systems, which made it possible to run forecasts multiple times from slightly different starting conditions. And doing so gave meteorologists a better indication of how likely future weather events would be.
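To make the ensemble idea concrete, here is a minimal sketch in Python. The Lorenz-63 equations stand in for a weather model – the system, time step, and perturbation size are all illustrative assumptions, not part of any operational forecast suite – but the principle is the same: run the model many times from slightly perturbed starting conditions and watch how the spread of the ensemble grows with lead time.

```python
import numpy as np

# Toy illustration of ensemble forecasting: the same model is run many times
# from slightly perturbed initial conditions, and the spread of the ensemble
# indicates how confident a forecast can be at a given lead time.
# Lorenz-63 is a classic chaotic toy problem, used here purely as a stand-in.

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 system one step with forward Euler."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

rng = np.random.default_rng(0)
n_members = 50
base_state = np.array([1.0, 1.0, 1.05])

# Each ensemble member starts from the base state plus a tiny random
# perturbation, mimicking uncertainty in the observed initial conditions.
ensemble = base_state + 1e-3 * rng.standard_normal((n_members, 3))

for step in range(1, 1501):
    ensemble = np.array([lorenz63_step(m) for m in ensemble])
    if step % 500 == 0:
        spread = ensemble.std(axis=0).mean()
        print(f"step {step:4d}: ensemble spread = {spread:.3f}")
```

The spread starts tiny and grows rapidly with lead time, which is exactly why ensembles are a useful guide to how much confidence to place in a forecast.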
Also, increased computing power allowed for smaller grid sizes, which – as well as making weather forecasts more local – opened the door to resolving convective showers and storms. For example, the UK Met Office – which today uses three Cray XC40 supercomputing systems to run an atmospheric model based on 215 billion global weather observations – introduced its first convective scale weather model in 2010.
Warm air rising and coming into contact with cooler air higher up in the atmosphere forms convective or cumuliform clouds. But because these convective weather events were small relative to the grid size, they couldn’t be described explicitly until model resolution improved to below 10 km. That’s not to say that cloud formation isn’t a massive event. Cumulus clouds, which take their name from the Latin word meaning ‘heap’ or ‘pile’, can weigh 500 tonnes.
One of the largest sources of uncertainty in climate projections
To recap, clouds form as water vapor heated by the sun rises and condenses around small particles in the atmosphere, such as sand, pollen, smoke, and more recently (on a geological timescale) industrial pollution. Clouds comprise hundreds of millions of droplets and are one of the largest sources of uncertainty in climate projections, according to experts.
Researchers need to run high-resolution algorithms to have a chance of representing cloud effects in weather and climate models. And a popular method of numerically simulating clouds and atmospheric turbulence is known as large-eddy simulation, which can account for the mixing of water vapor in the air.
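As a rough illustration of the eddy-viscosity idea behind large-eddy simulation, the toy sketch below advances a one-dimensional Burgers equation and models mixing by unresolved scales with a Smagorinsky-style eddy viscosity. Everything here – the 1-D equation, the grid, the constants – is a simplifying assumption for illustration only; real atmospheric LES solves three-dimensional filtered equations with moisture, thermodynamics, and radiation.

```python
import numpy as np

# Toy 1-D sketch of the eddy-viscosity idea behind large-eddy simulation:
# the resolved ("large eddy") field is advanced explicitly, while mixing by
# unresolved scales is modelled with a Smagorinsky-style eddy viscosity.

N  = 256                 # resolved grid points
L  = 2 * np.pi           # periodic domain length
dx = L / N
nu = 5e-3                # molecular viscosity
Cs = 0.17                # Smagorinsky constant (typical textbook value)
dt = 1e-3

x = np.arange(N) * dx
u = np.sin(x)            # initial resolved velocity field

def ddx(f):
    """Centred first derivative on the periodic grid."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2 * dx)

def d2dx2(f):
    """Centred second derivative on the periodic grid."""
    return (np.roll(f, -1) - 2 * f + np.roll(f, 1)) / dx**2

for step in range(800):                      # integrate to t = 0.8
    dudx = ddx(u)
    nu_t = (Cs * dx) ** 2 * np.abs(dudx)     # eddy viscosity for unresolved mixing
    sgs  = ddx(nu_t * dudx)                  # divergence of the modelled subgrid stress
    u = u + dt * (-u * dudx + nu * d2dx2(u) + sgs)

print("max resolved gradient:", np.abs(ddx(u)).max())
print("max eddy viscosity   :", ((Cs * dx) ** 2 * np.abs(ddx(u))).max())
```

The eddy viscosity is largest where the resolved field has the sharpest gradients – the places where unresolved turbulence would be doing the most mixing.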
But typically, computing resources only allow for small domain sizes, which are then repeated to account for behavior over a larger area. There’s also the scattering of the sun’s radiant energy by cloud particles to take into account, which is another reason why it’s important to get cloud behavior right: doing so improves the accuracy of surface temperature predictions. Being able to treat convective storms and atmospheric motion more realistically helps in forecasting regional precipitation more accurately too.
Researchers use computer models for a variety of tasks, as Ruby Leung – a climate scientist based at Pacific Northwest National Laboratory – explains. Meteorologists may want to simulate a specific event, such as a hurricane with its strong winds and associated rainfall, which may lead to flooding. But climate scientists also use models to make predictions over longer time periods, comparing pre-industrial conditions with the present and projecting further ahead.
Small uncertainties can have a big impact on forecasts when projecting far into the future. And to narrow that window of uncertainty, meteorologists are keen to utilize the processing power that an exascale supercomputer brings to account for as much detail as possible.
Energy Exascale Earth System Model
The Energy Exascale Earth System Model (E3SM) project is developing a new atmosphere model dubbed the Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM), which is capable of using graphics processing unit (GPU) architectures to run efficiently at a global resolution of roughly 3 km cell length. And thanks to a testing window provided through the US Department of Energy’s Exascale Computing Project, E3SM’s developers had the opportunity to see what version one of SCREAM was capable of.
During the 10 days of testing, the team used 8,192 nodes (each containing four AMD Instinct MI250X GPUs) of the Frontier exascale supercomputer to set a record performance of more than one simulated year per day (SYPD). The model was run at 3.25 km global resolution with 128 vertical atmospheric layers.
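For a sense of scale, here is a back-of-the-envelope estimate of the problem size and what 1 SYPD buys. The cell count assumes a uniform 3.25 km spacing over the whole globe rather than SCREAM’s actual cubed-sphere grid, so treat the numbers as order-of-magnitude only.

```python
# Rough problem-size and throughput estimate for the run described above.
# Assumes uniform 3.25 km spacing; the real grid differs, so this is
# order-of-magnitude only.

EARTH_SURFACE_KM2 = 5.1e8      # approximate surface area of Earth
CELL_LENGTH_KM    = 3.25       # horizontal resolution quoted for the run
VERTICAL_LAYERS   = 128        # vertical layers quoted for the run

columns = EARTH_SURFACE_KM2 / CELL_LENGTH_KM**2
cells   = columns * VERTICAL_LAYERS
print(f"~{columns / 1e6:.0f} million columns, ~{cells / 1e9:.1f} billion grid cells")

# 1 SYPD means one simulated year per wall-clock day, so a decade-long
# climate experiment would need roughly ten days of machine time.
sypd = 1.0
years_to_simulate = 10
print(f"{years_to_simulate} simulated years at {sypd} SYPD ≈ "
      f"{years_to_simulate / sypd:.0f} days of wall-clock time")
```

Even as a rough estimate, that is billions of grid cells advanced fast enough to churn through a model year every day – the kind of turnaround that multi-decadal climate experiments demand.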
“Running at this resolution globally requires tremendous computing resources in order to achieve the integration rates needed for multi-decadal climate simulations,” point out members of the E3SM project. “[And] achieving 1 SYPD on an Exascale machine represents a breakthrough.”