2. Workflow Quick Start

This Quick Start Guide will help users build and run the “out-of-the-box” case for the Unified Forecast System (UFS) Medium-Range Weather (MRW) Application on preconfigured (Level 1) machines. The steps in this chapter run the application in uncoupled, free-forecast mode.

2.1. Building the UFS Medium-Range Weather Application

  1. Clone the MRW App from GitHub:

    git clone -b master https://github.com/ufs-community/ufs-mrweather-app.git
    
  2. Check out the external repositories:

    cd ufs-mrweather-app
    ./manage_externals/checkout_externals
    
  3. Build the global-workflow components (including the UFS Weather Model):

    sh build_global-workflow.sh [-a <UFS_app>] [-c <config_file>] [-v] [-h]
    
    where:
    • -a: Builds a specific UFS app instead of the default. Valid values: S2SWA (default) | ATM | ATMW | S2S | S2SW

    • -c: Selectively builds based on the provided config file instead of the default config.

    • -v: Builds with verbose output.

    • -h: Prints usage and exits.

    Users who run sh build_global-workflow.sh without any options will build the default option, S2SWA, which stands for Subseasonal to Seasonal with Wave and Aerosol capabilities.
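
    For example, to build only the atmospheric components (ATM) with verbose output, users could run the following (an illustrative invocation of the options described above):

    sh build_global-workflow.sh -a ATM -v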

2.2. Running the UFS Medium-Range Weather Application

  1. Download and stage data according to the instructions in Section 4.3 (if using new data or when operating on a Level 2-4 system).

  2. From the global-workflow/ush/rocoto directory, load the appropriate modules:

    cd path/to/ufs-mrweather-app/global-workflow/ush/rocoto
    

    On Orion:

    module load contrib/0.1
    module load rocoto/1.3.3
    

    On Hera:

    module use -a /contrib/anaconda/modulefiles
    module load anaconda/anaconda3-5.3.1
    

    On other Level 1 systems, users can run module spider to view the location of the rocoto modules.
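
    For example, on systems with Lmod available, the following commands locate and load Rocoto (module names and versions vary by system and are shown here for illustration only):

    module spider rocoto
    module load rocoto/<version>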

  3. Run the ./setup_expt.py experiment generator script:

    ./setup_expt.py forecast-only --pslot <experiment_name> [--app <valid_app>] --idate <YYYYMMDDHH> --edate <YYYYMMDDHH> --resdet <desired_resolution> --gfs_cyc <#> --comrot <PATH_TO_YOUR_COMROT_DIR> --expdir <PATH_TO_YOUR_EXPDIR>
    
    where:
    • Valid app values are: ATM (default) | ATMW | S2S | S2SW | S2SWA

    • Valid resdet values are: 48, 96, 192, 384, 768

    • --idate and --edate should be set to the same value; both refer to the initial start time of the experiment.

    • Valid values for gfs_cyc are: 0 (data assimilation only), 1 (00z only), 2 (00z and 12z), and 4 (00z, 06z, 12z, 18z)

    For example:

    ./setup_expt.py forecast-only --pslot test --app ATM --idate 2020010100 --edate 2020010100 --resdet 384 --gfs_cyc 1 --comrot /home/$USER/COMROT --expdir /home/$USER/EXPDIR
    

    This will generate COMROT and EXPDIR directories. Additionally, it will create a $PSLOT (specific experiment name) subdirectory within COMROT and EXPDIR and a collection of config files in $EXPDIR/$PSLOT.
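
    As a quick check, users can list the contents of the experiment directory (the exact set of config files varies with the selected app and workflow tasks):

    ls $EXPDIR/$PSLOT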

  4. Copy initial condition (IC) files into $COMROT/$PSLOT.

    cp <ICfile> $COMROT/$PSLOT
    

    where <ICfile> refers to a given IC file (add the -r argument to copy an entire directory). These files should be placed in a directory structure that follows the gfs.$YYYYMMDD/HH/atmos/INPUT naming convention. The INPUT folder within .../atmos/ contains the sfc files needed for the Global Forecast System (GFS) atmospheric model (ATM) to run.
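
    For example, for the 2020010100 cycle used above, the ICs could be staged as follows (the source path /path/to/staged/ICs is a placeholder for wherever the IC files were downloaded):

    mkdir -p $COMROT/$PSLOT/gfs.20200101/00/atmos/INPUT
    cp -r /path/to/staged/ICs/* $COMROT/$PSLOT/gfs.20200101/00/atmos/INPUT/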

  5. Edit config.base in $EXPDIR/$PSLOT. In particular, users will need to check/modify the following parameters: ACCOUNT, HOMEDIR, STMP, PTMP, HPSSARCH, SDATE, EDATE, and the value 384 in the export FHMAX_GFS_##=${FHMAX_GFS_##:-384} statement, where ## corresponds to the start hour of the experiment cycle. Adjust 384 to the desired length (in hours) of the forecast experiment.
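
    As an illustration, the relevant config.base entries might look like the following after editing for a 5-day forecast starting at 00z (the values shown are placeholders that depend on the user's account, system paths, and experiment dates):

    export ACCOUNT="<account_name>"
    export HPSSARCH="NO"
    export SDATE=2020010100
    export EDATE=2020010100
    export FHMAX_GFS_00=${FHMAX_GFS_00:-120}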

  6. Run the following to generate a crontab and .xml files for the experiment in $EXPDIR/$PSLOT:

    ./setup_workflow_fcstonly.py --expdir $EXPDIR/$PSLOT
    
  7. Submit the job through cron by copying the entry in $PSLOT.crontab into the user's crontab via crontab -e.
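
    Alternatively, on systems where the user has no other cron jobs, the generated file can be installed directly (note that this replaces the user's existing crontab):

    crontab $EXPDIR/$PSLOT/$PSLOT.crontab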

  8. Monitor status of workflow using rocotostat:

    rocotostat -d </path/to/workflow/database/file> -w </path/to/workflow/xml/file> [-c YYYYMMDDHHmm,[YYYYMMDDHHmm,...]] [-t taskname,[taskname,...]] [-s] [-T]
    

    where -c and -t are optional arguments referring to the cycle and task name, respectively.

    For example:

    rocotostat -d $PSLOT.db -w $PSLOT.xml
    
  9. Check status of specific task/job:

    rocotocheck -d </path/to/workflow/database/file> -w </path/to/workflow/xml/file> -c <YYYYMMDDHHmm> -t <taskname>
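
    For example, using the experiment generated above (the task name gfsfcst is illustrative; valid task names for the experiment appear in the rocotostat output):

    rocotocheck -d $PSLOT.db -w $PSLOT.xml -c 202001010000 -t gfsfcst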