3. Components

The MRW Application is a combination of several modeling components that work together to prepare, analyze, produce, and post-process forecast data. The major components of the system are:

  • Build System (CMake)

  • Workflow (global-workflow)

  • Data and Pre-processing (UFS_UTILS)

  • Forecast (UFS Weather Model)

  • Post-processing (Unified Post-Processor [UPP])

Additionally, the MRW Application includes the following optional components:

  • METplus-based Verification Suite

  • Visualization Examples

These components are documented within this User’s Guide and supported through a community forum.

3.1. Build System

The MRW Application includes an umbrella CMake-based build system that assembles the components necessary for running the application. This release is supported for use with Linux and Mac operating systems and with Intel or GNU compilers. There is a small set of system libraries that are assumed to be present on the target computer, including CMake, a compiler, and an MPI library that enables parallelism. For a full list, see Section 1.2.2: Prerequisites in the Introduction.

Prerequisite libraries necessary for the application (e.g., NCEPLIBS and NCEPLIBS-external) are not included in the MRW Application build system but are available pre-built on preconfigured platforms. On other systems, they can be installed as a software bundle via the HPC-Stack. On preconfigured (Level 1) platforms, the MRW App is expected to build and run out of the box, and users can proceed directly to using the workflow, as described in the Quick Start Guide. On configurable (Level 2) platforms, the software stack is expected to install successfully, but it is not available in a central location; applications and models are expected to build and run once HPC-Stack has been built. Limited-Test (Level 3) platforms are those on which the developers have built the code and conducted limited pre-release testing, while on Build-Only (Level 4) platforms the code has been built but no pre-release testing has been conducted. A complete description of the levels of support, along with a list of preconfigured and configurable platforms, can be found here.

3.2. Global Workflow

The MRW Application leverages the Rocoto-based Global Workflow. The Global Workflow repository (global-workflow) contains the workflow layer of the application, which ensures that each task in the experiment runs in the proper sequence. The Global Workflow also allows model components to communicate through driver scripts, provides scripts for specific model component processes, and includes a broad range of configuration files. It provides ways to choose experiment parameters such as the grid resolution, physics namelist options, forecast length in hours, and history file frequency. It also allows for configuration of other elements of the workflow; for example, whether to run some or all of the pre-processing, forecast model, and post-processing steps. For a full accounting of Global Workflow capabilities, visit the Global Workflow wiki.

After running the checkout script in the sorc directory, the Global Workflow also pulls in the code and scripts for the analysis, forecast, and post-processing components. These non-workflow components are known as submodules. Each of the system submodules has its own repository.

3.3. Data and Pre-Processing

The MRW App includes the chgres_cube pre-processing software, which is part of the UFS_UTILS pre-processing utilities package. chgres_cube converts Global Forecast System (GFS) analyses to the format needed by the Weather Model. GFS analysis files are gridded datasets that provide a snapshot of the state of the atmosphere at a specific time. Additional information about chgres_cube can be found in the UFS_UTILS Technical Documentation.

GFS analyses for initializing the MRW App can be in one of three formats:

  • Gridded Binary v2 (GRIB2) format (with 0.50 or 1.0 degree grid spacing),

  • The NOAA Environmental Modeling System (NEMS) Input/Output (NEMSIO) format, or

  • Network Common Data Form (NetCDF) format. Initialization from dates starting on January 1, 2018 is supported; earlier dates may work but are not guaranteed.

GFS public archives can be accessed through the THREDDS Data Server at NCEI. A small sample of files in all supported formats can be found at the EMC FTP site. Additionally, public archives of model data can be accessed through the NOAA Operational Model Archive and Distribution System (NOMADS). The initial conditions may be pre-staged on disk by the user; alternatively, users can automatically download the files as part of the Global Workflow if they have access to NOAA HPSS.
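As a concrete sketch of staging initial conditions, the snippet below builds the kind of file name commonly used for GRIB2 GFS analyses and checks the supported-date rule described above. The gfs.tHHz.pgrb2.0p50.f000 naming pattern and the helper names are assumptions for illustration only; verify file names against the archive you actually download from.

```python
from datetime import datetime

def gfs_grib2_analysis_name(cycle: datetime, res: str = "0p50") -> str:
    """Illustrative GRIB2 analysis file name for a given GFS cycle.

    The gfs.tHHz.pgrb2.RES.f000 pattern follows common NOMADS naming
    conventions, but it is an assumption here -- check your data source.
    """
    return f"gfs.t{cycle:%H}z.pgrb2.{res}.f000"

def check_supported_date(cycle: datetime) -> bool:
    # Initialization dates from 1 January 2018 onward are supported;
    # earlier dates may work but are not guaranteed.
    return cycle >= datetime(2018, 1, 1)

cycle = datetime(2019, 8, 29, 0)       # 00 UTC cycle used in the sample plots
print(gfs_grib2_analysis_name(cycle))  # gfs.t00z.pgrb2.0p50.f000
print(check_supported_date(cycle))     # True
```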

Warning

For GFS data, dates prior to 1 January 2018 may work but are not guaranteed.

3.4. Forecast Model

The prognostic model in the MRW App is the atmospheric component of the UFS Weather Model, which employs the Finite-Volume Cubed-Sphere (FV3) dynamical core. The dynamical core is the computational part of a model that solves the equations of fluid motion. The atmospheric model in this release is an updated version of the atmospheric model that is being used in the operational GFS v16. A User’s Guide for the UFS Weather Model can be found here. Additional information about the FV3 dynamical core can be found in the scientific documentation, the technical documentation, and on the NOAA Geophysical Fluid Dynamics Laboratory website.

The UFS Weather Model ingests files produced by chgres_cube and outputs files in netCDF format, which use a Gaussian grid in the horizontal direction and model levels in the vertical direction. Supported grid configurations for this release are the global meshes with resolutions of C48 (~200 km), C96 (~100 km), C192 (~50 km), C384 (~25 km), and C768 (~13 km), all with 127 vertical levels. The NOAA Geophysical Fluid Dynamics Laboratory website provides more information about FV3 and its grids.

Table 3.1 Grid resolutions

  Grid   Degrees      Resolution
  C48    2 degrees    ~200 km
  C96    1 degree     ~100 km
  C192   1/2 degree   ~50 km
  C384   1/4 degree   ~25 km
  C768   1/8 degree   ~13 km
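The table's nominal resolutions follow from simple geometry: a cubed-sphere CN grid has N cells along each cube-face edge, and four faces span the equator, so roughly 4N cells circle the globe. A minimal sketch of that arithmetic (the circumference constant and the helper name are illustrative, not part of the model code):

```python
EARTH_CIRCUMFERENCE_KM = 40075.0  # approximate equatorial circumference

def nominal_resolution_km(c: int) -> float:
    # A cubed-sphere CN grid has N cells along each cube-face edge;
    # four faces span the equator, so ~4*N cells circle the globe.
    return EARTH_CIRCUMFERENCE_KM / (4 * c)

for c in (48, 96, 192, 384, 768):
    print(f"C{c}: ~{nominal_resolution_km(c):.0f} km")
# C96 works out to ~104 km, consistent with the table's nominal ~100 km.
```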

3.4.1. Physics

Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (CCPP), described here. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. Currently, the global-workflow uses CCPP v6.0.0, which includes the supported GFS_v17_p8 physics suite. This suite is a prototype of the physics suite that will be used in the operational implementation of the Global Forecast System (GFS) v17, and it is expected to evolve before its operational implementation in 2024. The GFS v17 physics suite includes improvements to the microphysics parameterizations, deep cumulus physics, gravity wave drag, and land surface model compared to the GFS v16 physics suite.

FV3_GFS_v17_p8 is used with the ATM configurations of the Weather Model, while FV3_GFS_v17_coupled_p8 is used with the subseasonal-to-seasonal (S2S) configurations of the model. A scientific description of the CCPP parameterizations and suites can be found in the CCPP Scientific Documentation, and CCPP technical aspects are described in the CCPP Technical Documentation. The model namelist has many settings beyond the physics suites that can optimize various aspects of the model for use with each of the supported suites.
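As a sketch, selecting the physics suite typically comes down to a single namelist entry. The fragment below follows FV3 namelist conventions, but the group and variable names should be treated as assumptions and verified against the model documentation for your release:

```fortran
! Illustrative fragment of input.nml -- group/variable names are
! assumed; verify against your model version's documentation.
&atmos_model_nml
  ccpp_suite = 'FV3_GFS_v17_p8'   ! suite for ATM-only configurations
/
```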

The use of stochastic processes to represent model uncertainty is also an option in this release, although the option is off by default in the supported physics suites. Five methods are supported for use separately or in combination: Stochastic Kinetic Energy Backscatter (SKEB), Stochastically Perturbed Physics Tendencies (SPPT), Specific Humidity perturbations (SHUM), Stochastically Perturbed Parameterizations (SPP), and Land Surface Model (LSM) SPP. A User’s Guide for the Stochastic Physics options is available here.
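For orientation, enabling the first three schemes is usually a matter of setting perturbation amplitudes in a stochastic-physics namelist group. The excerpt below is a hypothetical sketch: the amplitudes are illustrative values only, and the exact names should be checked against the Stochastic Physics User's Guide:

```fortran
! Hypothetical excerpt -- amplitudes are illustrative only; consult
! the Stochastic Physics User's Guide for the authoritative settings.
&nam_stochy
  sppt = 0.5     ! Stochastically Perturbed Physics Tendencies amplitude
  shum = 0.005   ! Specific Humidity perturbation amplitude
  skeb = 0.3     ! Stochastic Kinetic Energy Backscatter amplitude
/
```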

3.5. Unified Post-Processor (UPP)

The Medium-Range Weather (MRW) Application is distributed with a post-processing tool, the Unified Post Processor (UPP). The UPP converts the native netCDF output from the model to GRIB2 format on standard isobaric coordinates in the vertical direction. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the UPP User’s Guide.

The UPP output can be used with visualization, plotting and verification packages, or for further downstream post-processing (e.g., statistical post-processing techniques).
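As an illustration of the kind of derived diagnostic a post-processor computes, the sketch below derives 10-m wind speed and meteorological wind direction from the zonal (u) and meridional (v) components. The function names are hypothetical and not part of the UPP API:

```python
import math

def wind_speed(u: float, v: float) -> float:
    """Wind speed (m/s) from zonal and meridional components --
    one of the simplest diagnostics a post-processor derives."""
    return math.hypot(u, v)

def wind_direction(u: float, v: float) -> float:
    """Meteorological wind direction in degrees (direction the wind
    blows FROM: 0 = northerly, 90 = easterly)."""
    return (270.0 - math.degrees(math.atan2(v, u))) % 360.0

print(wind_speed(3.0, 4.0))        # 5.0
print(wind_direction(0.0, -5.0))   # 0.0  (wind from the north)
```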

3.6. METplus Verification Suite

The enhanced Model Evaluation Tools (METplus) verification system can be integrated into the MRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). It is supported by the Developmental Testbed Center (DTC).

METplus is included as part of the standard installation of the MRW App prerequisite libraries via HPC-Stack. It is also preinstalled on many Level 1 systems; existing builds can be viewed here. Additionally, some elements of METplus are incorporated into the MRW App’s Global Workflow via the EMC_verif-global subcomponent. This repository is a wrapper for running METplus within the workflow.

The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use-cases. MET is a set of verification tools developed for use by the NWP community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the METplus User’s Guide and MET User’s Guide. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus downloads page.

Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Currently, the EMC_verif-global subcomponent of the MRW App’s Global Workflow supports the use of GDAS and NAM observation files in prepBUFR format for point-based (grid-to-observation) verification. EMC_verif-global also supports use of gridded Climatology-Calibrated Precipitation Analysis (CCPA) 24-hour accumulation data for accumulated precipitation evaluation, and it uses the model’s own analysis file for grid-based (grid-to-grid) verification.
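For intuition about what a deterministic point-verification score looks like, the sketch below computes mean error (bias) and root-mean-square error for a handful of hypothetical 2-m temperature forecast/observation pairs. It is a toy illustration of the statistics involved, not part of MET:

```python
import math

def bias(fcst, obs):
    # Mean error: positive values indicate the forecast runs high.
    return sum(f - o for f, o in zip(fcst, obs)) / len(fcst)

def rmse(fcst, obs):
    # Root-mean-square error, a standard deterministic score.
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(fcst, obs)) / len(fcst))

# Hypothetical 2-m temperature pairs in kelvin.
fcst = [288.1, 290.4, 285.0, 291.2]
obs = [287.5, 291.0, 284.2, 290.8]
print(round(bias(fcst, obs), 3), round(rmse(fcst, obs), 3))  # 0.3 0.616
```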

METplus is being actively developed by NCAR/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (ESRL), and NOAA/Environmental Modeling Center (EMC), and it is open to community contributions.

3.7. Visualization Example

The MRW Application currently does not include full support for model visualization. A Python script (plot_mrw.py) is provided to create basic visualizations of the model output, and a difference plotting script (plot_mrw_cloud_diff.py) is also included to visually compare two runs for the same domain and resolution. These scripts are available in the plotting_scripts directory of the MRW Application. However, this capability is provided only as an example for users familiar with Python and is currently “use at your own risk.”

The scripts are designed to output graphics in .png format for several standard meteorological variables (i.e., 2-m temperature, hourly precipitation, cloud cover, and 10-m wind) over a user-specified time range on the pre-defined CONUS domain. The scripts can be used to visually verify the reasonableness of a forecast. At this time, users who wish to change the plotting domain will need to manually adjust the code, but support for more domains may be expanded in future releases. The scripts’ comments and the file python_plotting_documentation.txt describe the plotting scripts in more detail. Sample plots are provided for a 48-hour forecast initialized at 00 UTC on August 29, 2019, using GRIB2, NEMSIO, or netCDF files as input datasets.
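As a small illustration of the time handling such plotting scripts perform, the stdlib-only sketch below enumerates the valid times of the sample 48-hour forecast; the 6-hour interval is illustrative, not the scripts' fixed output frequency:

```python
from datetime import datetime, timedelta

def valid_times(init: datetime, length_hr: int, step_hr: int):
    """Valid times covered by a forecast of length_hr hours at a
    step_hr output interval (both endpoints included)."""
    return [init + timedelta(hours=h) for h in range(0, length_hr + 1, step_hr)]

# Sample case: 48-h forecast initialized 00 UTC 29 August 2019.
times = valid_times(datetime(2019, 8, 29, 0), 48, 6)
print(len(times), times[-1])  # 9 valid times, ending 2019-08-31 00:00
```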