Keynote Speakers


John Shalf

Wednesday, 2 July 2014

9:00 - 10:15


Speaker: John Shalf
Lawrence Berkeley National Laboratory


Title: Exascale Programming Challenges: Adjusting to the new normal for computer architecture

Summary

For the past twenty-five years, a single model of parallel programming (largely bulk-synchronous MPI) has for the most part been sufficient to permit the translation of complex applications into reasonable parallel programs. In 2004, however, a confluence of events changed forever the architectural landscape that underpinned our current assumptions about what to optimize for when we design new algorithms and applications. We have been taught to prioritize and conserve things that were valuable 20 years ago, but the new technology trends have inverted the value of our former optimization targets. The time has come to examine the end result of our extrapolated design trends and use them as a guide to re-prioritize what resources to conserve in order to derive performance for future applications. This talk will describe the challenges of programming future computing systems. It will then provide some highlights from the search for durable programming abstractions that more closely track emerging computer technology trends, so that when we convert our codes over, they will last through the next decade.
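The abstract's reference point is the bulk-synchronous MPI model. As a rough illustration (not code from the talk), the sketch below shows that pattern in plain C with MPI: each rank computes on its local data, then a collective exchange doubles as the global synchronization point before the next step. The array size, step count, and placeholder arithmetic are invented for the example.

/*
 * Minimal sketch of the bulk-synchronous MPI pattern: local compute,
 * then a global collective that also synchronizes all ranks.
 * The "physics" here is a placeholder, not real application code.
 */
#include <mpi.h>
#include <stdio.h>

#define NSTEPS 10
#define NLOCAL 1024

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double u[NLOCAL];
    for (int i = 0; i < NLOCAL; i++)
        u[i] = (double)rank;                      /* toy initial data */

    for (int step = 0; step < NSTEPS; step++) {
        /* 1. Local computation phase: each rank works on its own data. */
        double local_sum = 0.0;
        for (int i = 0; i < NLOCAL; i++) {
            u[i] = 0.5 * (u[i] + (double)step);   /* placeholder update */
            local_sum += u[i];
        }

        /* 2. Bulk communication phase: the collective is also the global
         *    synchronization point -- no rank starts the next step until
         *    every rank has contributed. */
        double global_sum = 0.0;
        MPI_Allreduce(&local_sum, &global_sum, 1, MPI_DOUBLE,
                      MPI_SUM, MPI_COMM_WORLD);

        if (rank == 0)
            printf("step %d: global sum = %f\n", step, global_sum);
    }

    MPI_Finalize();
    return 0;
}

Compiled with mpicc and launched with mpirun, the program repeats the compute/exchange/synchronize rhythm that the abstract identifies as the dominant programming abstraction of the past twenty-five years.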

About the Speaker

John Shalf is CTO of the National Energy Research Scientific Computing Center (NERSC) and also Department Head for Computer Science and Data Sciences at Lawrence Berkeley National Laboratory. Shalf is a co-author of over 60 publications in the field of parallel computing software and HPC technology, including three best papers and the widely cited report “The Landscape of Parallel Computing Research: A View from Berkeley” (with David Patterson and others), as well as “ExaScale Software Study: Software Challenges in Extreme Scale Systems,” which sets the Defense Advanced Research Projects Agency’s (DARPA’s) information technology research investment strategy for the next decade. He was a member of the Berkeley Lab/NERSC team that won a 2002 R&D 100 Award for the RAGE robot. Before joining Berkeley Lab in 2000, he was a research programmer at the National Center for Supercomputing Applications at the University of Illinois and a visiting scientist at the Max-Planck-Institut für Gravitationsphysik/Albert Einstein Institute in Potsdam, Germany, where he co-developed the Cactus code framework for computational astrophysics.


Masaki Satoh

Thursday, 3 July 2014

9:00 - 10:15


Speaker: Masaki Satoh
Professor, Atmosphere and Ocean Research Institute, the University of Tokyo, Japan


Title: A Super High-Resolution Global Atmospheric Simulation by the Nonhydrostatic Icosahedral Atmospheric Model Using the K Computer

Summary

In the field of atmospheric science, general circulation models (GCMs) have been used to simulate the global atmospheric circulation. GCMs integrate a discretized formulation of the fluid equations for winds, temperature, and humidity. Recently, GCMs have been coupled with ocean, land-surface, and ecosystem models to construct Earth system models, and they are also used for numerical weather forecasts at operational meteorological agencies. Until very recently, the horizontal resolution of GCMs was around several hundred kilometers. As computing power increases, the resolution of GCMs becomes higher and more complicated processes are introduced. Our research team has developed a new type of global atmospheric model called NICAM (Nonhydrostatic Icosahedral Atmospheric Model), which covers the Earth with a quasi-uniform mesh [1] whose horizontal interval can be reduced to the sub-kilometer range on the K computer [2]. The model can be called a global cloud-resolving model (GCRM). With this model, global cloud distributions are reproduced well, as shown in Fig. 1. The target of GCRMs is multi-scale, multi-physics atmospheric phenomena whose scales range from sub-kilometer to several tens of thousands of kilometers, with interactive processes of cloud microphysics, radiation, and turbulence [3]. This presentation overviews recent results from super-high-resolution simulations with NICAM using the K computer.


Figure 1: Cloud image on 25 August 2012 simulated by the global 870 m-mesh nonhydrostatic model NICAM using the K computer. Typhoon Bolaven is reproduced (courtesy of R. Yoshida, AICS, RIKEN).
  1. Satoh, M., T. Matsuno, H. Tomita, H. Miura, T. Nasuno, and S. Iga, 2008: Nonhydrostatic Icosahedral Atmospheric Model (NICAM) for global cloud resolving simulations, J. Comp. Phys., 227, 3486-3514.
  2. Miyamoto, Y., Y. Kajikawa, R. Yoshida, T. Yamaura, H. Yashiro, and H. Tomita, 2013: Deep moist atmospheric convection in a sub-kilometer global simulation. Geophys. Res. Lett., 40, 4922-4926, doi:10.1002/grl.50944.
  3. Miura, H., M. Satoh, T. Nasuno, A. T. Noda, and K. Oouchi, 2007: A Madden-Julian Oscillation event realistically simulated by a global cloud-resolving model, Science, 318, 1763-1765.
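As a back-of-the-envelope check on the mesh sizes above (not NICAM code), the sketch below estimates the mean horizontal mesh interval of a recursively divided icosahedral grid: a grid at division level n has 10 * 4^n + 2 cells, and the mean interval is approximated here as the square root of the Earth's surface area divided by the cell count. The "glevel" loop range and the spacing formula are illustrative assumptions; with them, level 13 comes out at roughly 870 m, consistent with the mesh in Fig. 1 and the simulation in [2].

/*
 * Estimate the mean horizontal mesh interval of an icosahedral grid.
 * Cell count at division level n: 10 * 4^n + 2.
 * Mean interval (approximation): sqrt(4*pi*R^2 / N), R = Earth radius.
 * Compile with: cc grid_spacing.c -lm
 */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double PI = 3.14159265358979323846;
    const double R  = 6.371e6;                 /* mean Earth radius [m] */
    const double area = 4.0 * PI * R * R;      /* Earth surface area [m^2] */

    for (int glevel = 9; glevel <= 13; glevel++) {
        double n_cells = 10.0 * pow(4.0, glevel) + 2.0;
        double dx = sqrt(area / n_cells);      /* mean grid interval [m] */
        printf("glevel %2d: %12.0f cells, ~%6.0f m mesh interval\n",
               glevel, n_cells, dx);
    }
    return 0;
}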

About the Speaker

Education:

Experience:

Publications: