The Dynamical Core of the Met Office Unified Model: the challenge of future supercomputer architectures
This decade is set to be an interesting one for operational weather and climate modelling. The accuracy of weather forecasts has reached unprecedented and probably unexpected levels: large-scale measures of accuracy continue to improve at a rate of 1 day every 10 years, so that today's 3-day forecast is as accurate as the 1-day forecast was 20 years ago.
In order to maintain this level of improvement operational centres need to continue to increase the resolutions of their models. Increasingly this means running models at resolutions of the order of a kilometre. This leads to many challenges. One is how to handle processes that are only barely resolved at those scales. Another is how to present, and also verify, forecasts that are inherently uncertain due to the chaotic nature of the atmosphere.
A more practical issue, though, is simply how to run the models at these increased resolutions! Doing so requires harnessing the power of some of the world's largest supercomputers, which are entering a period of radical change in their architecture.
That challenge is made more difficult by the fact that the UK Met Office's model (the MetUM) is unified, in that the same dynamical core (and increasingly also the same physics packages and settings) is used for all our operational weather and climate predictions. The model therefore has to perform well across a wide range of both spatial scales [O(10^0)-O(10^4) km] and temporal scales [O(10^0)-O(10^4)], as well as across a wide range of platforms.
This talk will start by outlining the current status of the MetUM, then discuss planned developments (focussing on numerical aspects), before going on to highlight recent progress within GungHo!, the project that is redesigning the dynamical core of the model.