Introduction to the Earth System Modeling Framework

Coordination of Common Modeling Infrastructure

[Title graphic: overlapping Climate, Weather, and Data Assimilation domains]

Cecelia DeLuca
WGCM/WMP Meeting, Exeter, UK
cdeluca@ucar.edu
Oct 6, 2005
Outline
•   What is ESMF?
•   How Do ESMF and PRISM Differ?
•   Why Do ESMF and PRISM Differ?
•   Can ESMF and PRISM Be Usefully Combined?
•   Model Metadata and Earth System Curator
•   How Can WMP Help?
ESMF Background
ESMF grew out of the now-defunct Common Modeling Infrastructure Working Group,
which involved many operational and research centers in the U.S.
(Steve Zebiak and Robert Dickinson, chairs).

Three linked proposals were funded by NASA ESTO in 2002:
  1. Core framework (Killeen/NCAR)
  2. Modeling applications (Marshall/MIT)
  3. Data assimilation applications (da Silva/NASA GSFC)

Original ESMF applications:
•   NOAA GFDL atmospheres
•   NOAA GFDL MOM4 ocean
•   NOAA NCEP atmosphere, analyses
•   NASA GMAO models and GEOS-5
•   NASA/COLA Poseidon ocean
•   LANL POP ocean and CICE
•   NCAR WRF
•   NCAR CCSM
•   MITgcm atmosphere and ocean
New ESMF-Based Programs
Funding for Science, Adoption, and Core Development
Modeling, Analysis and Prediction Program for Climate Variability and Change
Sponsor: NASA
Partners: University of Colorado at Boulder, University of Maryland, Duke
University, NASA Goddard Space Flight Center, NASA Langley, NASA Jet Propulsion
Laboratory, Georgia Institute of Technology, Portland State University,
University of North Dakota, Johns Hopkins University, Goddard Institute for
Space Studies, University of Wisconsin, Harvard University, and more
The NASA Modeling, Analysis and Prediction Program will develop an ESMF-based
modeling and analysis environment to study climate variability and change.

Battlespace Environments Institute
Sponsor: Department of Defense
Partners: DoD Naval Research Laboratory, DoD Fleet Numerical, DoD Army ERDC,
DoD Air Force Weather Agency
The Battlespace Environments Institute is developing integrated Earth and space
forecasting systems that use ESMF as a standard for component coupling.

Spanning the Gap Between Models and Datasets: Earth System Curator
Sponsor: NSF
Partners: Princeton University, Georgia Institute of Technology, Massachusetts
Institute of Technology, PCMDI, NOAA GFDL, NOAA PMEL, DOE ESG
The ESMF team is working with data specialists to create an end-to-end
knowledge environment that encompasses data services and models.

Integrated Dynamics through Earth’s Atmosphere and Space Weather Initiatives
Sponsors: NASA, NSF
Partners: University of Michigan/SWMF, Boston University/CISM, University of
Maryland, NASA Goddard Space Flight Center, NOAA CIRES
ESMF developers are working with the University of Michigan and others to
develop the capability to couple together Earth and space software components.
What is ESMF?
•   ESMF provides tools for turning model codes into components with standard
    interfaces and standard drivers (see the sketch below).
•   ESMF provides data structures and common utilities that components use for
    routine services such as data communications, regridding, time management,
    configuration, and message logging.

[Architecture diagram]
ESMF Superstructure: AppDriver; Component Classes (GridComp, CplComp, State)
User Code
ESMF Infrastructure: Data Classes (Bundle, Field, Grid, Array); Utility Classes
(Clock, LogErr, DELayout, Machine)
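
To make "standard interfaces" concrete, here is a minimal, hypothetical sketch
of how a user model is wrapped as an ESMF Gridded Component. Names follow the
current ESMF Fortran API (e.g. ESMF_METHOD_RUN); the 2005-era interfaces
differed in detail, and my_model_mod/my_run are placeholder names:

    module my_model_mod
      use ESMF
      implicit none
    contains
      ! Standard registration routine: tells the driver which user
      ! routines implement the component's standard methods.
      subroutine SetServices(gcomp, rc)
        type(ESMF_GridComp)  :: gcomp
        integer, intent(out) :: rc
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN, &
             userRoutine=my_run, rc=rc)
        ! (Initialize and Finalize entry points are registered the same
        !  way with ESMF_METHOD_INITIALIZE / ESMF_METHOD_FINALIZE.)
      end subroutine SetServices

      ! Every entry point shares one standard signature: component,
      ! import state, export state, clock, return code.
      subroutine my_run(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS
        ! ... read forcing from importState, step the model forward
        !     one coupling interval, fill exportState ...
      end subroutine my_run
    end module my_model_mod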

 Outputs and outcomes …
 •   Open-source, collaboratively developed software utilities and coupling interfaces,
     exhaustive test suite, documentation, support and training.
 •   A federation of geophysical components that can be assembled in multiple ways,
     using different drivers and different couplers.
 •   An Earth science organization that has focused interactions at many levels:
     software engineer and support scientist, technical and scientific manager,
     scientist, director, sponsor.
 •   An extended community with strong connections and many diverse science options.
ESMF Components and Couplers
Application Example: GEOS-5 AGCM
[Diagram: hierarchical tree of user-written GEOS-5 AGCM components]

•   Each box is a user-written ESMF component
•   Every component has a standard interface so that it is (technically) swappable
•   Data in and out of components are packaged as state types with user-defined
    fields (see the sketch below)
•   New components can easily be added to the hierarchical system
•   Many different structures can be assembled by switching the tree around
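
As an illustration of the state mechanism, the hypothetical fragment below
shows a parent packaging a field into a child component's import state before
invoking its standard Run method. The names (couple_to_ocean, oceanComp, sst)
are placeholders, and the calls follow the current ESMF Fortran API:

    subroutine couple_to_ocean(oceanComp, sst, expState, clock, rc)
      use ESMF
      implicit none
      type(ESMF_GridComp)  :: oceanComp
      type(ESMF_Field)     :: sst      ! e.g. sea surface temperature
      type(ESMF_State)     :: expState
      type(ESMF_Clock)     :: clock
      integer, intent(out) :: rc
      type(ESMF_State)     :: impState

      ! Package the field into an import state ...
      impState = ESMF_StateCreate(name="ocean import", rc=rc)
      call ESMF_StateAdd(impState, fieldList=(/sst/), rc=rc)
      ! ... and hand it to the child through the standard interface.
      call ESMF_GridCompRun(oceanComp, importState=impState, &
           exportState=expState, clock=clock, rc=rc)
    end subroutine couple_to_ocean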
But!
• It is possible to do a “wrap” of an existing model with ESMF, without needing
  to change internal data structures, by just creating one Component box
• This is generally lightweight in terms of performance
• Users can choose to use all of ESMF or just some of it

[Performance chart]
• Measures overhead of the ESMF superstructure in the NCEP Spectral Statistical
  Interpolation (SSI) analysis: ~1% overall
• Run on NCAR IBM
• Runs done by JPL staff, confirmed by NCEP developers
ESMF Development Status
• Concurrent or sequential execution, single or multiple executable
• Support for configuring ensembles
• Logically rectangular grids with regular and arbitrary distributions can be
  represented and regular distributions can be regridded
• On-line parallel regridding (bilinear, 1st order conservative) implemented and
  optimized
• Other parallel methods (e.g. halo, redistribution, low-level communications)
  implemented
• Utilities such as the time manager (see the sketch below), logging, and
  configuration manager are usable, with features still being added
• Fortran interfaces and complete documentation; some C++ interfaces


ESMF software is not yet a hardened, out-of-the-box solution
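
For example, the time-manager utility mentioned above can drive a run loop like
the hypothetical one below. Calls follow the current ESMF Fortran API
(defaultCalKind, ESMF_CALKIND_GREGORIAN); older releases used slightly
different names:

    program clock_demo
      use ESMF
      implicit none
      type(ESMF_Clock)        :: clock
      type(ESMF_Time)         :: startTime, stopTime
      type(ESMF_TimeInterval) :: timeStep
      integer                 :: rc

      call ESMF_Initialize(defaultCalKind=ESMF_CALKIND_GREGORIAN, rc=rc)

      ! A one-day run with a 6-hour coupling step.
      call ESMF_TimeSet(startTime, yy=2005, mm=10, dd=6, rc=rc)
      call ESMF_TimeSet(stopTime,  yy=2005, mm=10, dd=7, rc=rc)
      call ESMF_TimeIntervalSet(timeStep, h=6, rc=rc)
      clock = ESMF_ClockCreate(timeStep=timeStep, startTime=startTime, &
              stopTime=stopTime, rc=rc)

      do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
         ! ... call component Run methods for this interval ...
         call ESMF_ClockAdvance(clock, rc=rc)
      end do

      call ESMF_Finalize(rc=rc)
    end program clock_demo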
ESMF Platform Support
•   IBM AIX (32 and 64 bit addressing)
•   SGI IRIX64 (32 and 64 bit addressing)
•   SGI Altix (64 bit addressing)
•   Cray X1 (64 bit addressing)
•   Compaq OSF1 (64 bit addressing)
•   Linux Intel (32 and 64 bit addressing, with mpich and lam)
•   Linux PGI (32 and 64 bit addressing, with mpich)
•   Linux NAG (32 bit addressing, with mpich)
•   Linux Absoft (32 bit addressing, with mpich)
•   Linux Lahey (32 bit addressing, with mpich)
•   Mac OS X with xlf (32 bit addressing, with lam)
•   Mac OS X with Absoft (32 bit addressing, with lam)
•   Mac OS X with NAG (32 bit addressing, with lam)

•   User-contributed g95 support
Current Challenges
Refocus core development team
• Base infrastructure is complete – now need support for unstructured grids,
   multi-block grids with complex boundary behavior (e.g. tripole, cubed sphere),
   more regridding options, and constructs for data assimilation
• Team composition must change correspondingly
• Better, smarter testing: the suite of 1600 unit tests, 15 system tests, and
  30+ examples still needs to be extended
• Major increase in demand for customer support and training
Many new requirements
• Commercial tool for tracking requirements (DOORS)
• New representative body for prioritizing development tasks (Change Review
   Board)


Organizationally and technically, ESMF infrastructure will take another 3-5
years to mature
ESMF vs. PRISM

[Diagram: the two layered architectures side by side]

PRISM:
•   Run-time environment
•   Coupling Superstructure
•   User Code
•   Utility Infrastructure

ESMF:
•   ESMF Superstructure: AppDriver; Component Classes (GridComp, CplComp, State)
•   User Code
•   ESMF Infrastructure: Data Classes (Bundle, Field, Grid, Array); Utility
    Classes (Clock, LogErr, DELayout, Machine)
Other Differences …
ESMF
[Diagram: a nested “Seasonal Forecast” application; a coupler joins ocean,
sea ice, and assim_atm components; assim_atm nests assim and atmland;
atmland nests atm and land]
• Components are generally in the same executable
• Components are often nested
• Multiple couplers
• Data is passed through states at the beginning and end of method execution
  (see the sketch below)

PRISM
[Diagram: a single central Coupler connected to four peer components]
• Components are generally in separate executables
• Components are generally not nested
• Single coupler
• Data is transferred through put/get
• Data can go from anywhere to anywhere in another component
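
To make the state-based style concrete, here is a hypothetical sketch of an
ESMF coupler Run method: data crosses component boundaries only through states
at method entry and exit, rather than via put/get calls issued from inside a
running component. The field name "SST", the module name, and the precomputed
route handle are all assumptions; calls follow the current ESMF Fortran API:

    module my_coupler_mod
      use ESMF
      implicit none
      ! Assumed precomputed in the coupler's Initialize method
      ! with ESMF_FieldRegridStore().
      type(ESMF_RouteHandle), save :: remap
    contains
      subroutine cpl_run(cplcomp, importState, exportState, clock, rc)
        type(ESMF_CplComp)   :: cplcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        type(ESMF_Field)     :: srcField, dstField

        ! "SST" is a field name both components agreed on.
        call ESMF_StateGet(importState, itemName="SST", field=srcField, rc=rc)
        call ESMF_StateGet(exportState, itemName="SST", field=dstField, rc=rc)
        ! Interpolate from the source grid to the destination grid.
        call ESMF_FieldRegrid(srcField, dstField, routehandle=remap, rc=rc)
      end subroutine cpl_run
    end module my_coupler_mod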
Motivation for Common Modeling Infrastructure
• Support for modeling workflows (e.g. job submission, version control,
  annotation and archival of experiments, integration with visualization and
  analysis tools)
• Model intercomparison and interchange of model components
• Better utilization of compute resources and performance optimization
• Cost effectiveness: shared, fully featured common utilities (e.g. logging,
  timing, regridding, calendars, I/O, parallelization tools)
• Systematic internal architecture of multi-component models, support for many
  different drivers and configurations
Why Do ESMF and PRISM Differ?
For both ESMF and PRISM, the overall design was decided by a large group of
experienced modelers… so how did the two efforts wind up with such different
solutions?

• The PRISM single-driver approach leads to greater effective interoperability
  for a constrained (climate) domain
• The ESMF approach leads to limited interoperability for a broader set of
  domains: climate, weather, space weather, data assimilation, i.e. support for
  seamless prediction

Both ESMF and PRISM face similar requirements, but have taken different paths
to fulfill them
Can ESMF and PRISM Be Usefully Combined?
•   ESMF can use PRISM run-time elements
•   PRISM can use the ESMF utility layer
•   ESMF can offer a put/get paradigm for greater flexibility
•   ESMF components can be described using PRISM PMIOD
    files (XML description of model inputs/outputs and content),
    and ESMF data transfers expressed as PRISM put/gets, so
    that the same component can run in both systems (done
    with MOM4)
Model Metadata and Earth System Curator
Earth System Curator takes the ESMF/PRISM interaction a step further:
• Recognize that models and datasets are described by similar metadata
• Develop standards for model metadata, especially in the area of grids
• Work with umbrella groups developing metadata standards (e.g. GO-ESSP) to
    integrate model and data metadata
• Work with groups developing ontologies (LEAD, ESML) to invest metadata
    standards with structure and flexibility
• Work with GFDL, CCSM and PCMDI to link databases that store models,
    experiments, and data to serve MIPs and IPCC
Anticipated result:
• Coordinated growth of ESMF and PRISM
• Opportunities to develop smarter tools (e.g. compatibility, assembly) based on
    metadata information
How Can WMP Help?
• Support and promote common modeling infrastructure
   ◦ Maintain a science-driven methodology
   ◦ Emphasize long-term investment and continuity
   ◦ Communicate expectations – the “plug and play” myth
• Support and promote efforts to generate metadata standards
  and ontologies
   ◦ For the interaction of ESMF and PRISM
   ◦ For the development of a more comprehensive and useful
      modeling environment
• Help determine how to utilize infrastructure as an entry point into
  the broader (international) modeling community
