
SCALE-LES: Strategic development of large eddy simulation suitable to the future HPC

Presented by: 
Hirofumi Tomita [RIKEN/AICS]
Friday 28th September 2012, 13:30 to 13:55
INI Seminar Room 1
Session Title: 
Novel Optimisation Techniques
Large-eddy simulation (LES) is a vital dynamical framework for investigating cloud-aerosol-chemistry-radiation interactions from the viewpoint of the climate problem. So far, LES as used in the meteorological field has had several problems. One is that the grid sizes used are large, compromising the suitability of LES; in addition, the aspect ratio of horizontal to vertical grid spacing has been much larger than unity. For atmospheric LES, the grid size should be reduced to several tens of metres, and an aspect ratio near unity is desirable. The target domain has also been narrow because of limited computer resources. Large-scale computing on recent powerful supercomputers may enable us to conduct LES with a reasonable grid size over a wide domain; ultimately, global LES is one of our milestones for the near future. Another problem of LES applied in the meteorological field is that the heat source due to water condensation is injected into a grid box. Strictly speaking, such grid-box heating breaks the assumption of LES theory that the grid size lies within the energy-cascade range; nevertheless, we have used the dry theory of LES.

Besides the above problems, which should be resolved in the future, we are now confronting computational problems for such large-scale calculations. The numerical method for the fluid-dynamical part of atmospheric models has shifted from the spectral transform method to grid-point methods. The former is no longer acceptable on massively parallel platforms because of the limitations of interconnect communication; the latter, however, brings a new problem, the so-called memory-bandwidth problem. For example, even on the K computer, the B/F ratio is just 0.5. The key to high computational performance is reducing loads and stores to and from main memory and using cache memory efficiently. A similar problem occurs in the communication between compute nodes.

The multidisciplinary team (Team SCALE) at RIKEN/AICS is now tackling these problems.