
  • Earth System Modeling Framework

    Workshop on “Coupling Technologies for Earth System Modelling : Today and Tomorrow”

    CERFACS, Toulouse (France) – Dec 15th to 17th 2010

    Ryan O’Kuinghttons, Robert Oehmke, Cecelia DeLuca

  • Motivation

    In climate research and numerical weather prediction… an increased emphasis on detailed representation of individual physical processes requires many teams of specialists to contribute components to an overall modeling system.

    In computing technology… increasing hardware and software complexity in high-performance computing as we shift toward scalable computing architectures.

    In software… the emergence of frameworks to promote code reuse and interoperability.

    The ESMF is a focused community effort to tame the complexity of models and the computing environment. It leverages, unifies, and extends existing software frameworks, creating new opportunities for scientific contribution and collaboration.

  • Evolution

    Phase 1: 2002-2005

    NASA’s Earth Science Technology Office ran a solicitation to develop an Earth System Modeling Framework (ESMF). A multi-agency collaboration (NASA/NSF/DOE/NOAA) won the award. The core development team was located at NCAR. A prototype ESMF software package (version 2r) demonstrated feasibility.

    Phase 2: 2005-2010

    New sponsors included the Department of Defense and NOAA. Many new applications and requirements were brought into the project, motivating a complete redesign of framework data structures (version 3r).

    Phase 3: 2010-2015

    The core development team moved to NOAA/CIRES for closer alignment with federal models. Basic framework development will be complete with version 5r (ports, bugs, feature requests, interoperability templates, user support, etc. still require resources). The focus is on increasing adoption and creating a community of interoperable codes.

  • Components

    • ESMF is based on the idea of components – functionally distinct sections of code that are wrapped in standard interfaces

    • Components may represent either a physical domain or a function

    • Components can be arranged hierarchically, helping to organize the structure of complex models

    [Figure: ESMF components in the GEOS-5 atmospheric GCM — the agcm component contains dynamics (fvcore, gravity_wave_drag), physics (chemistry, moist_processes, radiation with infrared and solar, turbulence), surface (lake, land_ice, data_ocean, land with vegetation and catchment), and history components, connected by couplers.]

  • Architecture

    [Figure: ESMF “sandwich” architecture — the ESMF Superstructure (Components Layer: Gridded Components and Coupler Components) sits above the User Code (Model Layer), which sits above the ESMF Infrastructure (Fields and Grids Layer and Low Level Utilities) and external libraries such as MPI and NetCDF.]

    • ESMF provides a superstructure for assembling geophysical components into applications.

    • ESMF provides an infrastructure that modelers use to:

      – Generate and apply interpolation weights

      – Handle metadata, time management, data I/O and communications, and other functions

      – Access third party libraries

  • Standard Interfaces

    • All ESMF components have the same three standard methods:

      – Initialize
      – Run
      – Finalize

    • Each standard method has the same simple interface:

      call ESMF_GridCompRun (myComp, importState, exportState, clock, …)

      where:
      myComp points to the component
      importState is a structure containing input fields
      exportState is a structure containing output fields
      clock contains timestepping information

    Steps to adopting ESMF (a component-side sketch follows this list):

    • Divide the application into components (without ESMF)

    • Copy or reference component input and output data into ESMF data structures

    • Register components with ESMF

    • Set up ESMF couplers for data exchange

    • Interfaces are wrappers and can often be set up in a non-intrusive way
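    A minimal sketch of the component-side wrapper, assuming an ESMF 5-era Fortran API (the module name my_atm_component and the routines my_init, my_run, and my_final are hypothetical; method-flag and module names differ slightly between ESMF versions):

      module my_atm_component                ! hypothetical user component
        use ESMF                             ! spelled ESMF_Mod in older releases
        implicit none
        public :: SetServices
      contains

        ! Registration: tell ESMF which user routines implement the three
        ! standard methods.
        subroutine SetServices(gcomp, rc)
          type(ESMF_GridComp)  :: gcomp
          integer, intent(out) :: rc
          call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, &
               userRoutine=my_init, rc=rc)
          call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN, &
               userRoutine=my_run, rc=rc)
          call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_FINALIZE, &
               userRoutine=my_final, rc=rc)
        end subroutine SetServices

        ! All three standard methods share the same argument list.
        subroutine my_init(gcomp, importState, exportState, clock, rc)
          type(ESMF_GridComp)  :: gcomp
          type(ESMF_State)     :: importState, exportState
          type(ESMF_Clock)     :: clock
          integer, intent(out) :: rc
          rc = ESMF_SUCCESS
          ! ... create Fields from the model's native data, add them to exportState ...
        end subroutine my_init

        subroutine my_run(gcomp, importState, exportState, clock, rc)
          type(ESMF_GridComp)  :: gcomp
          type(ESMF_State)     :: importState, exportState
          type(ESMF_Clock)     :: clock
          integer, intent(out) :: rc
          rc = ESMF_SUCCESS
          ! ... take one model time step: read from importState, update exportState ...
        end subroutine my_run

        subroutine my_final(gcomp, importState, exportState, clock, rc)
          type(ESMF_GridComp)  :: gcomp
          type(ESMF_State)     :: importState, exportState
          type(ESMF_Clock)     :: clock
          integer, intent(out) :: rc
          rc = ESMF_SUCCESS
          ! ... clean up model state ...
        end subroutine my_final

      end module my_atm_component

    The driver registers the component with ESMF_GridCompSetServices and then invokes it through ESMF_GridCompInitialize, ESMF_GridCompRun, and ESMF_GridCompFinalize.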

  • Data Representation Options

    1. Representation in index space (Arrays)

    • One or more tiles store indices and topology

    • Sparse matrix multiply for remapping with user-supplied interpolation weights

    • Highly scalable: no global information is held locally; a distributed directory approach (Devine 2002) gives access to randomly distributed objects in an efficient, scalable way

    2. Representation in physical space (Fields)

    • Built on Arrays plus some form of Grid

    • Grids may be logically rectangular, unstructured mesh, or observational data

    • Remapping uses parallel interpolation weight generation: bilinear, higher order, or first order conservative (see the regridding sketch below)

    [Figure: supported Array distributions]
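    A minimal sketch of Field-based remapping with precomputed interpolation weights, assuming a recent ESMF Fortran API (grid sizes and field names are illustrative; the coordinate setup a real application performs on each Grid before computing weights is omitted):

      subroutine remap_example(rc)
        use ESMF
        implicit none
        integer, intent(out)   :: rc
        type(ESMF_Grid)        :: srcGrid, dstGrid
        type(ESMF_Field)       :: srcField, dstField
        type(ESMF_RouteHandle) :: rh

        ! Two logically rectangular Grids (coordinates must still be added
        ! via ESMF_GridAddCoord/ESMF_GridGetCoord before regridding).
        srcGrid = ESMF_GridCreateNoPeriDim(maxIndex=(/180, 90/), rc=rc)
        dstGrid = ESMF_GridCreateNoPeriDim(maxIndex=(/360, 180/), rc=rc)

        ! Fields built on those Grids.
        srcField = ESMF_FieldCreate(srcGrid, typekind=ESMF_TYPEKIND_R8, &
             name="sst_src", rc=rc)
        dstField = ESMF_FieldCreate(dstGrid, typekind=ESMF_TYPEKIND_R8, &
             name="sst_dst", rc=rc)

        ! Compute interpolation weights once (bilinear here; higher order and
        ! conservative methods use other regridmethod flags) ...
        call ESMF_FieldRegridStore(srcField=srcField, dstField=dstField, &
             regridmethod=ESMF_REGRIDMETHOD_BILINEAR, routehandle=rh, rc=rc)

        ! ... then apply them each coupling step as a sparse matrix multiply.
        call ESMF_FieldRegrid(srcField, dstField, routehandle=rh, rc=rc)

        call ESMF_FieldRegridRelease(routehandle=rh, rc=rc)
      end subroutine remap_example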

  • Coupling Options

    • Lots of flexibility in coupling approaches

    • Single executable (a sequential driver sketch follows this list)

    • Multiple executable

      – Array send/recv with the InterComm package (PVM)

      – Web service option

      – Other options

    • Coupling communications can be called either from within a coupler or directly from a gridded component – useful when it is inconvenient to return from a component in order to perform a coupling operation

    • Recursive components for nesting higher resolution regions

    • Ensemble management with either concurrent or sequential execution of ensemble members

    [Figure: multiple executable options — Comp A and Comp B exchanging data via Array send/recv, and via a Coupler contributed by U Maryland.]
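    A minimal sketch of a single-executable, sequentially coupled driver, assuming a recent ESMF Fortran API (the modules atm_component, ocn_component, and a2o_coupler are hypothetical and would each provide a SetServices routine like the one sketched earlier):

      program coupled_driver
        use ESMF
        use atm_component, only: atmSetServices => SetServices   ! hypothetical
        use ocn_component, only: ocnSetServices => SetServices   ! hypothetical
        use a2o_coupler,   only: cplSetServices => SetServices   ! hypothetical
        implicit none

        type(ESMF_GridComp) :: atmComp, ocnComp
        type(ESMF_CplComp)  :: cplComp
        type(ESMF_State)    :: atmExport, ocnImport
        integer             :: rc

        call ESMF_Initialize(rc=rc)

        ! Create the components and register their standard methods.
        atmComp = ESMF_GridCompCreate(name="atm", rc=rc)
        ocnComp = ESMF_GridCompCreate(name="ocn", rc=rc)
        cplComp = ESMF_CplCompCreate(name="atm2ocn", rc=rc)
        call ESMF_GridCompSetServices(atmComp, userRoutine=atmSetServices, rc=rc)
        call ESMF_GridCompSetServices(ocnComp, userRoutine=ocnSetServices, rc=rc)
        call ESMF_CplCompSetServices(cplComp, userRoutine=cplSetServices, rc=rc)

        ! States carry the fields exchanged between components.
        atmExport = ESMF_StateCreate(name="atm export", rc=rc)
        ocnImport = ESMF_StateCreate(name="ocn import", rc=rc)

        ! One pass of the sequential coupling sequence, atm -> coupler -> ocn.
        ! (Initialize/Finalize calls and the ESMF_Clock a real driver would
        ! create and pass to each call are omitted for brevity.)
        call ESMF_GridCompRun(atmComp, exportState=atmExport, rc=rc)
        call ESMF_CplCompRun(cplComp, importState=atmExport, &
             exportState=ocnImport, rc=rc)
        call ESMF_GridCompRun(ocnComp, importState=ocnImport, rc=rc)

        call ESMF_Finalize(rc=rc)
      end program coupled_driver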

  • Metadata Handling and Usage

    • Metadata is broken down into name/value pairs by the Attribute class (a small Attribute sketch follows this list)

      – Can be attached at any level of the ESMF object hierarchy

      – Documents data provenance to encourage self-describing models

      – Automates some aspects of model execution and coupling

        - Actively exploring this direction with workflows and web services

    • Standard metadata is organized by Attribute packages

      – Used to aggregate, store, and output model metadata

      – Can be nested, distributed, and expanded to suit specific needs

        - Designed around accepted metadata standards

    • Emerging conventions

      – Climate and Forecast (CF)

      – ISO standards

      – METAFOR Common Information Model (CIM)
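    A minimal sketch of name/value Attribute usage on a Field, assuming a recent ESMF Fortran API (the attribute names follow CF conventions; the Field itself is illustrative):

      subroutine attribute_example(sstField, rc)
        use ESMF
        implicit none
        type(ESMF_Field), intent(inout) :: sstField   ! an existing Field
        integer, intent(out)            :: rc
        character(len=64)               :: units

        ! Attach CF-style name/value pairs to the Field.
        call ESMF_AttributeSet(sstField, name="standard_name", &
             value="sea_surface_temperature", rc=rc)
        call ESMF_AttributeSet(sstField, name="units", value="K", rc=rc)

        ! Retrieve a value later, e.g. when recording provenance.
        call ESMF_AttributeGet(sstField, name="units", value=units, rc=rc)
      end subroutine attribute_example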

  • Summary of Features

    • Fast parallel remapping: unstructured or logically rectangular grids, 2D and 3D, using bilinear, higher order, or conservative methods, integrated (during runtime) or offline (from files)

    • Multiple strategies for support of nested grids

    • Core methods are scalable to tens of thousands of processors

    • Supports hybrid (threaded/distributed) programming for optimal performance on many computer architectures

    • Multiple coupling and execution modes for flexibility

    • Time management utility with many calendars (Gregorian, 360-day, no-leap, Julian day, etc.), forward/reverse time operations, alarms, and other features (a clock sketch follows this list)

    • Metadata utility supports emerging standards in a flexible and useful way

    • Runs on 25+ platform/compiler combinations, exhaustive test suite and documentation

    • Couples between Fortran and/or C-based model components
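    A minimal sketch of the time management utility, assuming a recent ESMF Fortran API (the dates, 30-minute step, and clock name are illustrative; calendar-kind constants are spelled slightly differently in older releases):

      subroutine clock_example(rc)
        use ESMF
        implicit none
        integer, intent(out)    :: rc
        type(ESMF_Calendar)     :: gregorianCal
        type(ESMF_Time)         :: startTime, stopTime
        type(ESMF_TimeInterval) :: timeStep
        type(ESMF_Clock)        :: clock

        ! Gregorian calendar; 360-day, no-leap, Julian day, etc. are also supported.
        gregorianCal = ESMF_CalendarCreate(ESMF_CALKIND_GREGORIAN, &
             name="gregorian", rc=rc)

        call ESMF_TimeSet(startTime, yy=2010, mm=12, dd=15, calendar=gregorianCal, rc=rc)
        call ESMF_TimeSet(stopTime,  yy=2010, mm=12, dd=17, calendar=gregorianCal, rc=rc)
        call ESMF_TimeIntervalSet(timeStep, m=30, rc=rc)    ! 30-minute time step

        clock = ESMF_ClockCreate(timeStep=timeStep, startTime=startTime, &
             stopTime=stopTime, name="application clock", rc=rc)

        ! Step the clock forward until the stop time is reached.
        do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
           call ESMF_ClockAdvance(clock, rc=rc)
        end do

        call ESMF_ClockDestroy(clock, rc=rc)
        call ESMF_CalendarDestroy(gregorianCal, rc=rc)
      end subroutine clock_example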

  • Class Structure

    Superstructure (F90):

      – GridComp: land, ocean, atm, … model
      – CplComp: transfers between GridComps
      – State: data imported or exported

    Infrastructure (F90/C++):

      – Field: physical field, e.g. pressure
      – FieldBundle: collection of Fields
      – Array: hybrid F90/C++ arrays
      – Grid, LocStream, Mesh: LogRect, Unstruct, etc. (C++ implementation)
      – Xgrid: exchange grid
      – DistGrid: grid decomposition
      – DELayout: communications
      – Route: stores comm paths (data communications)
      – Regrid: computes interp weights (C++ implementation)
      – Utilities: Machine, TimeMgr, LogErr, I/O, Config, Attributes, etc.

  • Component Overhead

    • Representation of the overhead for an ESMF-wrapped native CCSM4 component

    • For this example, ESMF wrapping required NO code changes to scientific modules

    • No significant performance overhead (< 3% is typical)

    • Few code changes for codes that are modular

    • Platform: IBM Power 575 (bluefire) at NCAR

    • Model: Community Climate System Model (CCSM)

    • Versions: CCSM_4_0_0_beta42 and ESMF_5_0_0_beta_snapshot_01

    • Resolution: 1.25 degree x 0.9 degree global grid with 17 vertical levels for both the atmospheric and land models, i.e. a 288x192x17 grid. The data resolution for the ocean model is 320x384x60.

  • Remapping Performance

    • ESMF parallel conservative remapping is scalable and accurate

    • Bilinear could use additional optimization


    • All ESMF interpolation weights are generated using an unstructured mesh representation

    • Increases flexibility with 2D and 3D grids

    • Adds overhead to bilinear interpolation

    • Greatly improves performance over existing conservative methods

    • Platform: Cray XT4 (jaguar) at ORNL

    • Versions: ESMF_5_2_0_beta_snapshot_07

    • Resolution:

      – fv0.47x0.63: CAM Finite Volume grid, 576x384

      – ne60np4: 0.5 degree cubed sphere grid, 180x180x6

  • A Common Model Architecture

    • The US Earth system modeling community is converging on a common modeling architecture

    • Atmosphere, ocean, sea ice, land, wave, and other models are ESMF or ESMF-like components called by a top-level driver or coupler

    • Many models are componentizing further

    ESMF-enabled systems include:

    • Navy Coupled Ocean Atmosphere Mesoscale Prediction System / Wavewatch III

    • Navy Operational Global Atmospheric Prediction System

    • Hybrid Coordinate Ocean Model – CICE Sea Ice

    • NOAA National Environmental Modeling System

    • NOAA GFDL MOM4 Ocean

    • NASA GEOS-5 Atmospheric General Circulation Model

    • NCAR Community Earth System Model

    • Weather Research and Forecast Model

    • HAF Kinematic Solar Wind – GAIM Ionosphere

    • pWASH123 Watershed – ADCIRC Storm Surge Model

    Features and Benefits:

    • Interoperability promotes code reuse and cross-agency collaboration

    • Portable, fast, fully featured toolkits enhance capability

    • Automatic compliance checking for ease of adoption

  • A Common Model Architecture

    • Increasingly, models in the U.S. follow a common architecture

    • Atmosphere, ocean, sea ice, land, and/or wave models are components called by a top-level driver/coupler

    • Components use ESMF or ESMF-like interfaces (see the interface comparison below)

    • Many major U.S. weather and climate models either follow this architecture (CCSM/CESM, COAMPS, NEMS), want to follow this architecture for future coupled systems (NOGAPS), or have a different style of driver but could provide components to this architecture (GEOS-5, FMS)

    Even non-ESMF codes now look like ESMF …

    ESMF:

      ESMF_GridCompRun(gridcomp, importState, exportState, clock, phase, blockingFlag, rc)

    CESM (non-ESMF version):

      atm_run_mct(clock, gridcomp, importState, exportState)

    (argument names changed to show equivalence)

    [Figure: ESMF Model Map 2010. Ovals show ESMF components (thin lines) and models (thick lines) that are at the working prototype level or beyond; shading marks where ESMF coupling is complete. Sponsors: NOAA, Department of Defense, NASA, Department of Energy, National Science Foundation, and universities. Systems shown include NEMS (GFS and NMM-B atmosphere physics/dynamics, history, I/O), GEOS-5 (FV cubed-sphere dycore, GWD, moist processes, turbulence, LW/solar radiation, aerosol and stratospheric chemistry with GOCART, ocean, land, surface, history, GSI), CCSM4/CESM (CAM, CLM, POP, CICE, tracer advection), WRF, FIM, Land Information System, COAMPS, WWIII, NOGAPS, NCOM, HYCOM, MOM4, SWAN, ADCIRC, pWASH123, HAF, and GAIM.]

  • NUOPC Layer: Goals

    • The National Unified Operational Prediction Capability (NUOPC) is a consortium of operational weather prediction centers

    • Standardize the implementation of ESMF across NASA, NOAA, Navy, and other model applications

    • Demonstrate an improved level of interoperability

      – Specific goals are described in the NUOPC Common Model Architecture (CMA) Committee report

      – CMA report - http://www.weather.gov/nuopc/CMA_Final_Report_1%20Oct%2009_baseline.pdf?

      – NUOPC website - http://www.weather.gov/nuopc

  • NUOPC Layer: Products

    • Software templates to guide development of a common architecture for components and couplers

    • A software layer to narrow the scope of ESMF interfaces

    • NUOPC Compliance Checker software (initial implementation available with ESMF_5_1_0)

    • Comprehensive tutorial materials

    • Websites, repositories, trackers, and other collaborative tools

    • NUOPC Layer guidance documents (posted on the ESMF website)

    • ESMF - www.earthsystemmodeling.org

  • Other ESMF-related projects …

    • Earth System Curator (sponsors NSF/NASA/NOAA)

      – Implementation of the METAFOR Common Information Model in the Earth System Grid (ESG) portal for the 5th Coupled Model Intercomparison Project

      – Using ESMF Attributes to generate the METAFOR CIM schema directly from models

      – Atmosphere/hydrological model coupling using OpenMI and ESMF web services

    • Earth Science Gateway on the TeraGrid (sponsor NSF)

      – End-to-end self-documented workflows from web-based model configuration to data archival, with Purdue and NCAR

    • Global Interoperability Program (sponsor NOAA)

      – Support for projects involved with interoperability, infrastructure development, and global modeling education