ICON & EMAC: Chemistry, clouds and precipitation, booster usage | JULY 05, 2018 | CATRIN MEYER, OLAF STEIN
ICON: The ICOsahedral Non-hydrostatic modelling framework
A global circulation model for numerical weather prediction (NWP) and climate modelling, developed by the German Weather Service and MPI-Met:
• based on non-hydrostatic dynamics
• time splitting between the dynamical core and tracer advection, physics parametrizations, and horizontal diffusion
• flexible nesting capabilities (patches)
• mass-consistent tracer transport
• high computational efficiency and good scaling on massively parallel systems
• ICON with chemistry and aerosol components: ICON-ART
• multiple physics packages: ICON-NWP, ICON-A, ICON-LEM (, ICON-LAM)
→ allows simulating the same situation at different resolutions and with different settings
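The time-splitting idea mentioned above can be sketched schematically: the fast dynamical core is sub-stepped with a small time step, while the slower processes (tracer advection, physics) advance once per large step. This is a toy Python illustration of the pattern, not ICON code; the tendency functions are invented for the example.

```python
# Schematic sketch of time splitting (not ICON code): the fast
# dynamics is sub-stepped with a small dt, while slow processes
# (tracer advection, physics) advance once with the larger dt.

def fast_dynamics(u, dt):
    # hypothetical fast tendency, e.g. strong linear damping
    return u + dt * (-10.0 * u)

def slow_physics(u, dt):
    # hypothetical slow tendency, e.g. weak constant forcing
    return u + dt * 1.0

def time_split_step(u, dt_slow, n_sub):
    dt_fast = dt_slow / n_sub
    for _ in range(n_sub):           # sub-step the dynamical core
        u = fast_dynamics(u, dt_fast)
    return slow_physics(u, dt_slow)  # then apply slow processes once

u = 1.0
for _ in range(10):
    u = time_split_step(u, dt_slow=0.01, n_sub=5)
```

The sub-stepping keeps the fast process stable without forcing the expensive slow processes onto the same small time step.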
ICON: ICON-LEM simulations on JUQUEEN and JURECA in HD(CP)²
• addresses cloud and precipitation (CP) processes, which constitute one of the largest uncertainties in current weather and climate models
• uses the ICON-LEM (Large Eddy Model), which provides an unprecedented horizontal resolution of 156 m to actually resolve CP processes
• high-resolution hindcast simulations over very diverse regions, such as Germany and the Tropical Atlantic
• porting to and optimization for massively parallel HPC systems (JUQUEEN)
ICON: Performance of ICON on JUQUEEN (120 m horizontal resolution)
Problems to solve at the beginning:
• memory consumption proportional to the number of processes
• no prior knowledge of which process needs which data, because of the irregular grid
→ numerous restructurings of the code were needed
• testing of different development versions with different horizontal resolutions and configurations: more than 150 tests
Result: ICON efficiently uses the entire JUQUEEN machine
ICON: Simulation setup
• covers mainly Germany
• finest resolution: 156 m, more than 3.5 billion grid elements
Simulation of 24 hours:
• generates 50 TB of model output
• produces 16 TB of restart files
• needs 3.7 million core-hours on JUQUEEN
• runs for 7.8 days on 1024 cores
ICON: Scientific questions addressed in Phase 2 of HD(CP)²
• How do clouds respond to perturbations in their aerosol environment?
• What are the controlling factors for boundary layer clouds?
• What are the controlling factors for anvil cloud development?
• To what extent does land-surface heterogeneity control clouds and precipitation?
• To what extent is convective organization important for climate?
• How do clouds and convection influence the development of the storm tracks?
Sensitivity studies performed on JURECA to study the influence of microphysics on cloud development (simulation day: 05.07.2025, 624 m horizontal resolution)
EMAC: ECHAM/MESSy Atmospheric Chemistry
• MESSy is a framework for the standardized, bottom-up implementation of Earth System Models (or parts of them) with flexible complexity
• MESSy provides an infrastructure with generalized interfaces for the standardized coupling of "low-level ESM components" (dynamical cores, physical parameterizations, chemistry packages, diagnostics, etc.) → submodels
• MESSy currently comprises about 60 submodels:
  • infrastructure (= the framework) submodels
  • diagnostic submodels
  • atmospheric chemistry related submodels
  • model physics related submodels
The global atmosphere-chemistry model EMAC is based on ECHAM5, with submodels for chemical and physical processes in the troposphere and middle atmosphere, their interaction with ocean and land surfaces, and anthropogenic influences.
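The "generalized interfaces" idea can be illustrated with a minimal sketch: every submodel implements the same entry points, so a driver can couple them without knowing their internals. All class and method names below are invented for illustration and are not the actual MESSy (Fortran) API.

```python
# Hypothetical sketch of MESSy-style generalized submodel interfaces.
# Every submodel exposes the same entry points; the driver loops over
# them without knowing submodel internals. (Names are illustrative.)

class Submodel:
    name = "base"
    def initialize(self, state):   # called once at startup
        pass
    def physc(self, state, dt):    # called every physics time step
        pass

class SimpleChemistry(Submodel):
    name = "chem"
    def initialize(self, state):
        state.setdefault("tracer", 1.0)
    def physc(self, state, dt):
        state["tracer"] *= (1.0 - 0.1 * dt)  # toy first-order decay

class SimpleDiagnostic(Submodel):
    name = "diag"
    def physc(self, state, dt):
        state["diag_tracer_max"] = max(
            state.get("diag_tracer_max", 0.0), state["tracer"])

def run(submodels, n_steps, dt):
    state = {}
    for sm in submodels:
        sm.initialize(state)
    for _ in range(n_steps):
        for sm in submodels:       # standardized coupling loop
            sm.physc(state, dt)
    return state

state = run([SimpleChemistry(), SimpleDiagnostic()], n_steps=10, dt=1.0)
```

Because the driver only sees the common interface, chemistry, physics, and diagnostic submodels can be swapped in and out with flexible complexity, which is the design point the slide makes.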
EMAC: The MESSy coupler infrastructure is organized in four layers: the Base Model Layer, the Base Model Interface Layer, the Submodel Interface Layer, and the Submodel Core Layer.
COUPLING STRATEGIES for Earth System Models
• Online integration: a comprehensive and seamless ESM, but less flexible and harder to maintain
• Offline coupling (reading data from files): the classical approach, e.g. for CTMs, regional models, and pre-calculated parameterizations, but slow
• Online P2P coupling (OASIS, YAC, ESMF, …): parallel and scalable field exchange using coupler software based on MPI point-to-point communication, typically used for 2-D boundaries between ESM compartments
• 3-D coupling of processes is more challenging (e.g. chemistry)
→ MESSy: a framework for the standardized coupling of ESM submodels (physical processes, chemistry, diagnostics, …) with flexible complexity using generalized interfaces
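The P2P exchange of 2-D boundary fields between compartments can be illustrated with a toy coupling loop, here between an "atmosphere" and an "ocean" component. This pure-Python stand-in only shows the data flow; a real coupler such as OASIS or YAC exchanges the fields between parallel processes via MPI, and both component models here are invented.

```python
# Toy illustration of 2-D boundary-field exchange between two ESM
# compartments, in the spirit of P2P couplers (OASIS, YAC, ...).
# Pure-Python stand-in: a real coupler moves these fields via MPI.

def atmosphere_step(sst):
    # atmosphere derives a surface flux from the sea-surface temperature
    return [[0.1 * (t - 15.0) for t in row] for row in sst]

def ocean_step(sst, flux, dt):
    # ocean updates its SST with the flux received from the atmosphere
    return [[t - dt * f for t, f in zip(row_t, row_f)]
            for row_t, row_f in zip(sst, flux)]

sst = [[20.0, 18.0], [16.0, 14.0]]
for _ in range(3):                   # coupling loop over exchange steps
    flux = atmosphere_step(sst)      # "send" SST, "receive" flux back
    sst = ocean_step(sst, flux, dt=1.0)
```

Only the 2-D boundary field crosses the interface, which is why this pattern scales well; 3-D process coupling (e.g. chemistry in every grid cell) needs the tighter submodel approach described in the last bullet.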
ICON: Coupling for the ICON ESM
Internal coupling via ICON interface modules:
• physical parameterizations, I/O processes
• ART chemistry
External coupling software (work in progress):
• YAC: coupling of ESM components (e.g. land and ocean) at MPI-M
• OASIS3-MCT: coupling of the surface model (CLM) and the hydrologic model (ParFlow) in TerrSysMP
• MESSy: diagnostics, trajectory analyses; comprehensive chemistry and other sub-processes
BOOSTER USAGE IN THE ESM COMMUNITY: Chemistry climate modelling with EMAC
Chemical kinetics is typically the most computationally demanding subtask in EMAC (Alvanos & Christoudias, 2017):
• a software package automatically generates CUDA kernels to numerically integrate atmospheric chemical kinetics in EMAC
• a source-to-source compiler outputs a CUDA-compatible kernel by parsing the Fortran code generated by the Kinetic PreProcessor (KPP) for atmospheric chemical kinetics
• good speedups have been achieved so far for relatively small chemical mechanisms; tests are ongoing on Cy-Tera and JURECA
• the approach can potentially serve as the basis for hardware acceleration of numerous geoscientific models that rely on KPP for atmospheric chemical kinetics
• tests are ongoing for KNL booster usage on JURECA
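Why chemical kinetics maps so well onto accelerators can be shown with a minimal sketch: each grid cell integrates the same small kinetics ODE independently of its neighbours, so every cell can become one GPU thread. The toy mechanism below (A → B with rate k, solved analytically) is invented for illustration and is far simpler than any real KPP-generated EMAC mechanism.

```python
# Minimal sketch of the per-grid-cell structure of KPP-style kinetics:
# each cell integrates the same small ODE independently, which is why
# the workload maps naturally onto one GPU thread per cell.
# Toy mechanism A -> B with rate k (not an actual EMAC mechanism).
import math

def integrate_cell(a0, b0, k, t):
    # exact solution of dA/dt = -k*A, dB/dt = +k*A over time t
    a = a0 * math.exp(-k * t)
    return a, b0 + (a0 - a)

def integrate_field(concentrations, k, t):
    # "kernel launch": the same kinetics applied to every grid cell;
    # on a GPU each loop iteration would run as one thread
    return [integrate_cell(a, b, k, t) for a, b in concentrations]

cells = [(1.0, 0.0), (2.0, 0.5), (0.3, 0.1)]
result = integrate_field(cells, k=0.5, t=1.0)
```

Mass is conserved per cell (A + B stays constant), a property the per-cell independence makes cheap to verify, and there is no inter-cell communication at all during the kinetics step.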
BOOSTER USAGE IN THE ESM COMMUNITY: Numerical weather prediction boosted by GPUs
COSMO (Fuhrer et al., 2018): since spring 2016, MeteoSwiss has operated the deterministic 1.1 km mesh-size COSMO-1 as well as the 21-member ensemble system COSMO-E on a GPU-based supercomputer.
10/2017: strong scaling to 4888 GPUs on Piz Daint with COSMO 5.0 (hybrid Cray XC50 Haswell/Pascal nodes, 16 GB high-bandwidth memory)
Major efforts (in total ~16 person-years):
• rewriting the dynamical core of the model (solution of the non-hydrostatic Euler equations) from Fortran to C++
• introducing a new C++ template-library-based domain-specific language (STELLA)
• specialized backends of the library produce efficient code for the target computing architecture; the STELLA GPU backend is written in CUDA
• other parts of the refactored COSMO (physical parameterizations and diagnostics) use OpenACC directives
ICON will take advantage of the COSMO developments!
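The STELLA design point (write the stencil once, let a backend decide how it is executed) can be sketched in a few lines. Both the stencil and the backend below are toy Python stand-ins; in STELLA the stencil is C++ template code and the backends emit CUDA or optimized CPU loops.

```python
# Schematic of the STELLA idea: the stencil is written once,
# independent of the iteration strategy, and a "backend" decides
# how the loop nest runs. Toy Python stand-in, not STELLA itself.

def laplacian(field, i, j):
    # 5-point stencil, knows nothing about how it will be iterated
    return (field[i - 1][j] + field[i + 1][j] +
            field[i][j - 1] + field[i][j + 1] - 4.0 * field[i][j])

def backend_naive(field, stencil):
    # naive sequential backend; a GPU backend would launch one
    # thread per (i, j) instead, applying the identical stencil
    n, m = len(field), len(field[0])
    return [[stencil(field, i, j) if 0 < i < n - 1 and 0 < j < m - 1
             else 0.0
             for j in range(m)] for i in range(n)]

# apply the stencil to a small test field f(i, j) = i * j
out = backend_naive([[float(i * j) for j in range(4)] for i in range(4)],
                    laplacian)
```

Separating the stencil from the backend is what let the COSMO team retarget the same dynamical core to CPUs and GPUs without rewriting the numerics twice.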
EMAC & ICON in the ESM project
• development of a world-leading Earth system modelling infrastructure
• frontier simulations as a demonstrator of the capability of the ESM infrastructure → monsoon systems in a changing climate (EMAC + MPIOM)
• transition from ECHAM6 and EMAC towards ICON as the atmospheric component of an emerging Helmholtz ESM
• requires additional interfaces for better use within the Modular Earth Submodel System (MESSy)
• further development of components for reactive chemistry and aerosols and for Lagrangian transport
• use of ICON's zooming capability for regional modelling
• coupling to other compartments of the Earth system (FESOM + CLM or LPJ-GUESS)