# Workshop Programme

## 22 - 25 October 2012

### Weather and climate prediction on next generation supercomputers


### Timetable

#### Monday 22 October

**12:35-13:30** Welcome and Opening

**13:30-14:15** Nigel Wood (Met Office): *The Dynamical Core of the Met Office Unified Model: the challenge of future supercomputer architectures*

This decade is set to be an interesting one for operational weather and climate modelling. The accuracy of weather forecasts has reached unprecedented, and probably unexpected, levels: large-scale measures of accuracy continue to improve at the rate of 1 day every 10 years, so that today's 3-day forecast is as accurate as the 1-day forecast was 20 years ago. To maintain this rate of improvement, operational centres need to continue to increase the resolutions of their models. Increasingly this means running models at resolutions of the order of a kilometre, which leads to many challenges. One is how to handle processes that are only barely resolved at those scales. Another is how to present, and also verify, forecasts that are inherently uncertain due to the chaotic nature of the atmosphere. A more practical issue, though, is simply how to run the models at these increased resolutions. Doing so requires harnessing the power of some of the world's largest supercomputers, which are entering a period of radical change in their architecture. That challenge is made more difficult by the fact that the UK Met Office's model (the MetUM) is unified: the same dynamical core (and increasingly also the same physics packages and settings) is used for all our operational weather and climate predictions. The model therefore has to perform well across a wide range of both spatial scales [O(10^0)-O(10^4) km] and temporal scales [O(10^0)-O(10^4)], as well as a wide range of platforms. This talk will start by outlining the current status of the MetUM, then discuss planned developments (focussing on numerical aspects), before going on to highlight recent progress within GungHo!, the project that is redesigning the dynamical core of the model.

**14:15-14:40** Mikhail Tolstykh (Russian Academy of Sciences): *Development of the next generation SLAV global atmospheric model*

SLAV is the global finite-difference semi-Lagrangian numerical weather prediction model used operationally at the Hydrometcentre of Russia. Its distinguishing features are a vorticity-divergence formulation in the horizontal plane on an unstaggered grid, and 4th-order finite differences. The version currently in development has a resolution of 0.18-0.22 degrees in latitude, 0.225 degrees in longitude, and 51 levels. The presentation covers two topics:

1. Further development of the existing hydrostatic version, including:
   - implementation of mass-conserving semi-Lagrangian advection on the reduced lat-lon grid, a 3D extension of (Tolstykh and Shashkin, JCP, 2012), with some preliminary results;
   - a recent increase in code scalability from 160 to more than 800 cores;
   - work on further increasing scalability.
2. Plans for the development of a global nonhydrostatic model. We plan to test a parallel elliptic solver (code developed at INM) on different massively parallel platforms, with a matrix arising in the semi-implicit discretization of an MC2-type nonhydrostatic model formulation. Depending on the results, we will choose between semi-implicit and horizontally explicit-vertically implicit (HE-VI) time integration schemes. Choosing HE-VI would imply radical changes in the next generation of our dynamical core; these changes will also be discussed in the presentation.
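Both opening talks centre on semi-Lagrangian transport, so a minimal illustration may help orient non-specialists. The Python sketch below is illustrative only, not code from the MetUM or SLAV; the wind profile, grid and all parameter values are arbitrary choices for the example.

```python
import numpy as np

def semi_lagrangian_step(q, u, dx, dt):
    """One semi-Lagrangian step for dq/dt + u dq/dx = 0 on a periodic grid:
    trace each arrival point back along the wind (one-step trajectory
    estimate; real schemes iterate this) and interpolate q at the departure
    point. Linear interpolation here; operational models use higher order."""
    n = q.size
    x = np.arange(n) * dx
    x_dep = (x - u * dt) % (n * dx)       # departure points, periodic wrap
    j = np.floor(x_dep / dx).astype(int)  # grid index left of each departure point
    w = x_dep / dx - j                    # linear interpolation weight
    return (1.0 - w) * q[j] + w * q[(j + 1) % n]

# Usage: a spatially varying wind at Courant numbers above 1 -- the scheme
# remains stable (the chief attraction of semi-Lagrangian transport), but
# total mass drifts, the deficiency that mass-conserving formulations fix.
n, dx, dt = 200, 1.0, 1.0
x = np.arange(n) * dx
u = 2.5 + 0.5 * np.sin(2.0 * np.pi * x / (n * dx))
q = np.exp(-0.5 * ((x - 50.0) / 5.0) ** 2)
for _ in range(40):
    q = semi_lagrangian_step(q, u, dx, dt)
print(q.sum() * dx)  # compare with the initial mass ~ 5*sqrt(2*pi)
```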
**14:40-15:10** Afternoon Coffee

**15:10-15:55** Nils Wedi (ECMWF): *ECMWF's roadmap for non-hydrostatic modelling, existing and future challenges and recent progress in solving these*

**16:00-16:25** Bill Skamarock (National Center for Atmospheric Research): *A cell-integrated SLSI shallow-water model with conservative and consistent mass and scalar mass transport*

**16:25-16:45** Afternoon Tea

**16:45-17:20** Richard Loft (UCAR): *G8 ECS: Enabling climate simulation at extreme scale*

**17:30-18:30** Welcome reception
#### Wednesday 24 October

**09:30-10:05** John McGregor (CSIRO): *Cube-based atmospheric GCMs at CSIRO*

Two cube-based atmospheric GCMs have been developed at CSIRO: the Conformal-Cubic Atmospheric Model (CCAM) and, more recently, the Variable Cubic Atmospheric Model (VCAM). The design of the dynamical cores of both models will be described and compared. CCAM is formulated on the conformal-cubic grid and employs 2-time-level semi-Lagrangian semi-implicit numerics. VCAM, on the other hand, is cast on the highly uniform equiangular gnomonic-cubic grid and employs a split-explicit flux-conserving approach, which provides benefits for modelling trace gases. Both models use reversible staggering of the wind components (McGregor, MWR, 2005) to produce good wave dispersion behaviour, and both use the same physics package. CCAM includes the Miller-White nonhydrostatic treatment, whereas VCAM is presently a hydrostatic model. Both models use an efficient MPI message-passing strategy. Although VCAM avoids the message-passing overheads necessitated by the Helmholtz solver of CCAM, it instead has some minor overheads related to more frequent calls to the wind staggering/unstaggering routines. Timings will be shown for simulations utilising up to 288 processors. Comparative model performance will be shown for idealised advection tests, the Held-Suarez test case, aquaplanet simulations, and AMIP simulations.

**10:05-10:40** Stephane Popinet (NIWA): *Quadtree-adaptive global atmospheric modelling on parallel systems*

I will present initial results for a three-dimensional, hydrostatic, global atmospheric model combining quadtree horizontal adaptivity on a cubed-sphere grid with a standard, layered vertical discretisation. The model is implemented within the Gerris framework (http://gfs.sf.net) and thus automatically inherits attributes such as data parallelism and load balancing. For large-scale atmospheric simulations, I will show that quadtree adaptivity leads to a large gain in the scaling exponent relating computational cost to the resolution of sharp frontal structures.
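As an illustration of the quadtree adaptivity described above, here is a toy Python sketch (not Gerris code, which is a C library): cells are split recursively wherever a refinement criterion fires, concentrating resolution near a hypothetical "front". The criterion and all parameters are invented for the example.

```python
class Cell:
    """A square quadtree cell with lower-left corner (x, y) and edge `size`."""
    def __init__(self, x, y, size, level):
        self.x, self.y, self.size, self.level = x, y, size, level
        self.children = []  # empty list means this is a leaf cell

    def refine(self, criterion, max_level):
        """Recursively split into four quadrants wherever the refinement
        criterion (e.g. a local gradient estimate) demands it."""
        if self.level < max_level and criterion(self):
            h = self.size / 2
            self.children = [Cell(self.x + i * h, self.y + j * h, h, self.level + 1)
                             for i in (0, 1) for j in (0, 1)]
            for c in self.children:
                c.refine(criterion, max_level)

    def leaves(self):
        if not self.children:
            yield self
        else:
            for c in self.children:
                yield from c.leaves()

# Usage: refine near a "front" along the line y = x, a stand-in for the
# sharp frontal structures mentioned in the abstract. The cost grows far
# more slowly with resolution than the 4**max_level cells of a uniform grid.
root = Cell(0.0, 0.0, 1.0, 0)
near_front = lambda c: abs((c.y + c.size / 2) - (c.x + c.size / 2)) < c.size
root.refine(near_front, max_level=6)
print(sum(1 for _ in root.leaves()), "leaf cells vs", 4 ** 6, "uniform cells")
```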
**10:40-11:00** Morning Coffee

**11:00-11:45** Mark Taylor (Sandia): *High resolution and variable resolution capabilities of the Community Atmosphere Model (CAM) with a spectral finite element dynamical core*

I will describe our work developing CAM-SE, a highly scalable version of the Community Atmosphere Model (CAM) running with the spectral element dynamical core from NCAR's High-Order Method Modeling Environment. For global 1/4 and 1/8 degree resolutions, CAM-SE runs efficiently on hundreds of thousands of processors on modern supercomputers and obtains excellent simulation throughput. CAM-SE also supports fully unstructured conforming quadrilateral grids. I will show results using a variable-resolution grid with 1/8 degree resolution over the central U.S., transitioning to 1 degree over most of the globe. We hope that variable resolution can provide a 10-100 times more efficient way to calibrate and evaluate the CAM 1/8 degree configuration. CAM-SE uses quadrilateral elements and tensor-product Gauss-Lobatto quadrature. Its fundamental computational kernels look like dense matrix-vector products, which map well to upcoming computer architectures. It solves the hydrostatic equations with a spectral element horizontal discretization and the hybrid-coordinate Simmons and Burridge (1981) vertical discretization. It uses a mimetic formulation of spectral elements which preserves the adjoint and annihilator properties of the divergence, gradient and curl operators. These mimetic properties result in local conservation (to machine precision) of mass, tracer mass and (2D) potential vorticity, and semi-discrete conservation (exact with exact time discretization) of total energy. Hyper-viscosity is used for all numerical dissipation.
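To make the "tensor-product Gauss-Lobatto" and "dense matrix-vector product" remarks above concrete, the following Python sketch builds the standard Gauss-Lobatto-Legendre nodes, weights and differentiation matrix for one element. These are textbook formulas, not CAM-SE source code, and the polynomial degree used is an arbitrary example value.

```python
import numpy as np
from numpy.polynomial import legendre as leg

def gll_nodes_weights(N):
    """Gauss-Lobatto-Legendre points and weights for degree N (N+1 points):
    endpoints +-1 plus the roots of P_N'(x); w_i = 2 / (N(N+1) P_N(x_i)^2)."""
    PN = leg.Legendre.basis(N)
    nodes = np.concatenate(([-1.0], np.sort(PN.deriv().roots()), [1.0]))
    return nodes, 2.0 / (N * (N + 1) * PN(nodes) ** 2)

def gll_diff_matrix(nodes, N):
    """Differentiation matrix D[i, j] = l_j'(x_i) for the GLL Lagrange basis."""
    P = leg.Legendre.basis(N)(nodes)
    D = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        for j in range(N + 1):
            if i != j:
                D[i, j] = P[i] / (P[j] * (nodes[i] - nodes[j]))
    D[0, 0], D[N, N] = -N * (N + 1) / 4.0, N * (N + 1) / 4.0
    return D

# On one quadrilateral element the derivative kernels are dense matrix
# products applied along each tensor direction -- the computational pattern
# the abstract highlights as mapping well to upcoming architectures.
N = 3                                    # example degree (an assumption here)
xi, w = gll_nodes_weights(N)
D = gll_diff_matrix(xi, N)
u = np.sin(np.pi * xi)[:, None] * np.cos(np.pi * xi)[None, :]  # sample field
du_dxi, du_deta = D @ u, u @ D.T         # derivatives in the two directions
mass = (w[:, None] * w[None, :] * u).sum()  # GLL quadrature: diagonal mass matrix
```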
**11:45-12:10** Colin M. Zarzycki and Christiane Jablonowski (University of Michigan): *Evaluating Variable-Resolution CAM-SE as a Numerical Weather Prediction Tool*

The global modelling community has traditionally struggled to simulate meso-alpha and meso-beta scale (25-500 km) systems in the atmosphere, such as tropical cyclones, strong fronts, and squall lines. With traditional General Circulation Model (GCM) resolutions of 50-300 km, these features are under-resolved and require significant parameterization at the sub-grid scale. To help alleviate these issues, the use of high-resolution limited area models (LAMs) has become popular, although, by definition, these models typically lack two-way communication with the exterior domain. Variable-resolution global dynamical models can serve as a bridge between traditional global forecast models and high-resolution LAMs by applying fine grid spacing in areas of interest. Such models can use existing computing platforms to achieve high resolution on a regional basis while maintaining global continuity, eliminating the need for the externally forced, and possibly numerically and physically inconsistent, boundary conditions required by LAMs. A statically nested, variable-mesh option has recently been introduced into the National Center for Atmospheric Research (NCAR) Community Atmosphere Model's (CAM) Spectral Element (SE) dynamical core. We present short-term CAM-SE simulations of historical tropical cyclones and compare the model's prediction of storm track and intensity to other global and regional models used operationally by hurricane forecast centers. Additionally, we explore the model's ability to simulate other weather phenomena traditionally unavailable to global modellers, such as mesoscale convective systems and precipitation lines associated with frontal passages. We also discuss the performance of existing parameterizations in CAM with respect to high-resolution modelling, and consider the potential computational benefits of a variable-resolution setup as an operational tool for both weather and climate prediction.

**12:10-12:35** Marcus J. Thatcher and John L. McGregor (CSIRO): *A prototype model for coupled simulations of regional climate suitable for massively parallel architectures*

The formulation of regional climate models has been undergoing major changes, including advances in variable-resolution models and attempts to simulate the coupled atmosphere-ocean system regionally. This talk outlines the design of a prototype global variable-resolution, coupled atmosphere-ocean model. Although the grid can be smoothly deformed into a global simulation, the climate model has been optimised for regional simulations, where the grid can be focused over a specified location using a Schmidt transformation. Both atmosphere and ocean dynamical cores employ a reversible staggering between Arakawa A and C grids; theoretically, this approach can produce very good dispersion characteristics for both atmosphere and ocean models. The performance of the model scales well to 300+ processors, and the approach is expected to be suitable for massively parallel architectures as it avoids the latency problems associated with mismatched atmosphere and ocean grids. Furthermore, it could be appropriate for global climate models if computing resources increase by a factor of 10 with the next generation of supercomputers. We can suppress error growth in the coarser regions of the variable-resolution grid by downscaling with a system of scale-selective filters, where the filters use an efficient convolution-based approach that can operate with non-periodic boundary conditions and, in the case of the ocean model, irregular coastlines. Some preliminary results are presented for practical applications of the model simulating regional climate, together with a discussion of the algorithms used for the reversible staggering and the scale-selective filters.

**12:35-13:30** Lunch

**13:30-14:15** Tom Edwards (Cray Inc.): *Earth system modeling strategies on extreme scale architectures*

Achieving the highest possible performance capability is a key requirement for the earth system modelling community. Extreme scale architectures, including those currently available, provide opportunities for the advancement of simulation capabilities and present challenges for the HPC community as a whole. A number of significant factors have been identified in the development and deployment of exascale systems, and the approaches taken to address these challenges will strongly influence modelling strategies. As we enter the exascale era, a determinant of success will be greater levels of cooperation between model developers and the broader HPC community. Several modelling groups have already engaged in co-development approaches to identify and address factors limiting performance, scalability and efficiency. A further consideration moving forward will be the impact on HPC architectures and workflows as the science community becomes increasingly engaged with data-intensive approaches.

**14:15-14:40** Eike Mueller and Robert Scheichl (University of Bath): *Scalability of Elliptic Solvers in Numerical Weather and Climate Prediction*

Numerical weather and climate prediction requires the solution of elliptic partial differential equations with a large number of unknowns on a spherical grid. In particular, if implicit time stepping is used in the dynamical core of the forecast model, an elliptic PDE has to be solved at each time step, which often amounts to a significant proportion of the model runtime. The goal of the Next Generation Weather and Climate Prediction (NGWCP) project is the development of a new dynamical core for the UK Met Office Unified Model with a significantly increased global model resolution, resulting in more than $10^{10}$ degrees of freedom for each atmospheric variable. To run the model operationally, the solver has to scale to hundreds of thousands of processor cores on modern computer architectures. To investigate the scalability of the implicit time stepping algorithm, we have tested and optimised existing solvers in the Distributed and Unified Numerics Environment (DUNE) and the hypre library. In addition, we have implemented a matrix-free parallel geometric multigrid code with a vertical line smoother. We demonstrate the scalability of the solvers on up to 65536 cores of the HECToR supercomputer for a system with $10^{10}$ degrees of freedom, for the elliptic PDE arising from semi-implicit semi-Lagrangian time stepping. To identify the most promising solver, we investigated the robustness of simple and widely used preconditioners, such as vertical line relaxation, as well as more advanced multigrid methods. We compared algebraic and matrix-free geometric multigrid algorithms to quantify the matrix and coarse-grid setup costs, and studied the performance of various solvers on different computer architectures.
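The vertical line smoother mentioned in this abstract exploits the strong grid anisotropy of atmospheric models (horizontal spacings of kilometres against vertical spacings of metres to hundreds of metres): each vertical column is solved exactly with a tridiagonal solve while the weak horizontal couplings are lagged. Below is a minimal Python sketch of the general technique, assuming a model anisotropic Poisson problem rather than the actual Met Office operator.

```python
import numpy as np

def thomas(a, b, c, d):
    """Tridiagonal solve (Thomas algorithm): a = sub-, b = main-, c = super-
    diagonal (a[0] and c[-1] unused), d = right-hand side."""
    n = b.size
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for k in range(1, n):
        m = b[k] - a[k] * cp[k - 1]
        cp[k] = c[k] / m
        dp[k] = (d[k] - a[k] * dp[k - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for k in range(n - 2, -1, -1):
        x[k] = dp[k] - cp[k] * x[k + 1]
    return x

def vertical_line_sweep(phi, rhs, eps):
    """One block-Jacobi sweep for the 7-point discretisation of
    eps*(d2/dx2 + d2/dy2) + d2/dz2 = rhs (unit spacing, homogeneous
    Dirichlet boundaries). Each vertical column is solved exactly;
    eps ~ (dz/dx)^2 << 1 mimics the vertical anisotropy of atmospheric
    grids, which is what makes line-in-the-vertical smoothers effective."""
    nx, ny, nz = phi.shape
    new = phi.copy()
    sub, sup = np.ones(nz), np.ones(nz)
    diag = np.full(nz, -2.0 - 4.0 * eps)
    for i in range(1, nx - 1):
        for j in range(1, ny - 1):
            d = rhs[i, j, :] - eps * (phi[i - 1, j, :] + phi[i + 1, j, :] +
                                      phi[i, j - 1, :] + phi[i, j + 1, :])
            new[i, j, :] = thomas(sub, diag, sup, d)
    return new

# Usage: the sweep damps error rapidly in the stiff vertical direction;
# inside a multigrid V-cycle, coarsening handles the horizontal directions.
phi = np.zeros((16, 16, 32))
rhs = np.random.default_rng(1).standard_normal(phi.shape)
for _ in range(5):
    phi = vertical_line_sweep(phi, rhs, eps=1e-2)
```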
**14:40-15:10** Afternoon Coffee

**15:10-15:55** Max Gunzburger (Florida State): *Parallel Algorithm for Spherical Delaunay Triangulations and Spherical Centroidal Voronoi Tessellations*

**16:00-16:25** Robert Scheichl (University of Bath): *Multilevel Markov Chain Monte Carlo Methods for Large Scale Problems*

Monte Carlo methods play a central role in stochastic uncertainty quantification and data assimilation. In particular, Markov chain Monte Carlo methods are of great interest in the atmospheric sciences, but they are notorious for their slow convergence and high computational cost. In this talk I will present recent developments that mitigate and overcome this serious problem using a novel multilevel strategy and deterministic sampling rules. The talk will focus on methodology; the applications so far come mainly from other fields. (An illustrative sketch of the multilevel idea appears at the end of this programme.)

**16:25-16:45** Afternoon Tea

**16:45-17:20** Richard Loft (UCAR): *Meeting the Challenge of Many-core Architectures in Weather and Climate Models*

Many-core processor systems such as GPUs and Intel's Xeon Phi achieve higher theoretical performance and improved power efficiency by trading a decrease in clock speed for an increase in the number of compute threads. The questions relevant to this meeting are: (1) do these architectures offer real benefits in performance over conventional multiprocessors for climate and weather applications? (2) If so, is it worth refactoring these large, complex applications to achieve those benefits? Over the past few years, many weather groups and a few climate groups around the world have been trying to answer these questions. This talk will survey their progress and experiences, as presented by them at a series of many-core workshops held at NCAR over the past two years. Specific topics will include: the right and wrong ways to measure, report, and think about many-core performance; an assessment of the various programming paradigms currently available for processor + many-core accelerator architectures; experience with different compilers and tools; and the viability of the code-refactoring strategies that have been tried for many-core processors.

**19:30-22:00** Workshop dinner
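Referenced from the Scheichl abstract above: a toy Python sketch of the multilevel idea, using plain multilevel Monte Carlo with a fabricated "model" rather than the Markov chain variant presented in the talk. All functions and parameter values here are invented for illustration.

```python
import numpy as np

def model(x, level):
    """Toy stand-in for a PDE model: exact answer sin(x) plus an
    O(2**-level) discretisation-error term (hypothetical, for illustration)."""
    return np.sin(x) + 2.0 ** (-level) * np.cos(x)

def mlmc(L, n_per_level, rng):
    """Multilevel estimator via the telescoping sum
        E[G_L] = E[G_0] + sum_{l=1..L} E[G_l - G_{l-1}].
    Each correction term uses the SAME random inputs on both levels, so its
    variance shrinks as l grows and only a few expensive fine-level samples
    are needed -- the source of the multilevel speed-up."""
    est = model(rng.standard_normal(n_per_level[0]), 0).mean()
    for l in range(1, L + 1):
        x = rng.standard_normal(n_per_level[l])
        est += (model(x, l) - model(x, l - 1)).mean()
    return est

# Usage: many cheap coarse samples, few expensive fine ones.
rng = np.random.default_rng(0)
print(mlmc(4, [100000, 20000, 4000, 800, 160], rng))
```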