5.2.11.2. CyclonePlotter: Use Case for OPC (EMC) cyclone data

model_applications/tc_and_extra_tc/CyclonePlotter_fcstGFS_obsGFS_OPC.conf

Scientific Objective

Once this use case is complete, a user-created extra TC track file for the valid date of interest (YYYYMMDDHH) will have been created and paired up by TCPairs, and the global storm tracks for that valid date will be plotted by CyclonePlotter (PlateCarree projection).

Datasets

Forecast: Adeck
/path/to/{init?fmt=%Y}/trak.gfso.atcf_gen.glbl.{init?fmt=%Y}
Observation: Bdeck
/path/to/{init?fmt=%Y}/trak.gfso.atcf_gen.glbl.{init?fmt=%Y}
Location: All of the input data required for this use case can be found in the met_test sample data tarball. Navigate to the METplus releases page and download the sample data tarball for the appropriate release: https://github.com/dtcenter/METplus/releases
The tarball should be unpacked into the directory that you will set as the value of INPUT_BASE. See the Running METplus section for more information.
Data Source: GFS

External Dependencies

You will need to use a version of Python 3.6+ that has the following packages installed:

  • cartopy

  • matplotlib

METplus Components

This use case utilizes output files created by a Python user script, which are then accessible via the TCPairs wrapper. Because the source file already contains tracked extra TCs, the TCPairs wrapper is passed the “Adeck” file for each storm twice: once as the adeck (forecast) file and once as the bdeck (analysis) file. Essentially, TCPairs matches a forecast to itself. The CyclonePlotter wrapper is then used to create a global plot of storm tracks for the desired day of interest (YYYYMMDDHH).

METplus Workflow

TCPairs is the first tool called in this example. It processes the following run times for each storm file:

Init/Valid: 2020100700

CyclonePlotter is the second (and final) tool called in this example. It processes the output from TCPairs.

METplus Configuration

METplus first loads all of the configuration files found in parm/metplus_config, then it loads any configuration files passed to METplus via the command line with the -c option, e.g. -c /path/to/CyclonePlotter_fcstGFS_obsGFS_OPC.conf
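The precedence of these layered configuration files can be sketched with Python's configparser: the last file read wins. This is only an illustration of the override behavior, not the actual METplus loading code, and the option values are made up:

```python
from configparser import ConfigParser

# Later reads override earlier ones, mirroring how a -c file
# overrides the defaults loaded from parm/metplus_config.
defaults = "[config]\nINIT_BEG = 2020010100\nLOOP_BY = INIT\n"
user = "[config]\nINIT_BEG = 2020100700\n"

parser = ConfigParser()
parser.read_string(defaults)  # base configuration, read first
parser.read_string(user)      # user -c file, read last, wins

print(parser.get("config", "INIT_BEG"))  # -> 2020100700
print(parser.get("config", "LOOP_BY"))   # -> INIT
```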

#
#  CONFIGURATION
#
[config]

# Looping by times: steps through each 'task' in the PROCESS_LIST for each
# defined time, and repeats until all times have been evaluated.
LOOP_ORDER = processes

# Configuration files
TC_PAIRS_CONFIG_FILE = {PARM_BASE}/met_config/TCPairsConfig_wrapped

# 'Tasks' to be run
PROCESS_LIST = UserScript, TCPairs, CyclonePlotter

LOOP_BY = INIT

# The init time begin and end times, increment, and last init hour.
INIT_TIME_FMT = %Y%m%d%H
INIT_BEG = 2020100700
INIT_END = 2020100700

# This is the step-size. Increment in seconds from the begin time to the end time
# set to 6 hours = 21600 seconds
INIT_INCREMENT = 21600

USER_SCRIPT_RUNTIME_FREQ = RUN_ONCE_PER_INIT_OR_VALID

USER_SCRIPT_PATH = {PARM_BASE}/use_cases/model_applications/tc_and_extra_tc/CyclonePlotter_fcstGFS_obsGFS_OPC/extract_opc_decks.py

USER_SCRIPT_INPUT_PATH = {INPUT_BASE}/model_applications/tc_and_extra_tc/CyclonePlotter_fcstGFS_obsGFS_OPC/trak.gfso.atcf_gen.glbl.{init?fmt=%Y}

USER_SCRIPT_COMMAND = {USER_SCRIPT_PATH} {USER_SCRIPT_INPUT_PATH} {USER_SCRIPT_OUTPUT_DIR} {init?fmt=%Y%m%d%H}

# A list of times to include, in format YYYYMMDD_hh
TC_PAIRS_INIT_INCLUDE =

# A list of times to exclude, in format YYYYMMDD_hh
TC_PAIRS_INIT_EXCLUDE =

# Specify model init time window in format YYYYMM[DD[_hh]]
# Only tracks that fall within the initialization time window will be used
TC_PAIRS_INIT_BEG =
TC_PAIRS_INIT_END =

# Specify model valid time window in format YYYYMM[DD[_hh]]
# Only tracks that fall within the valid time window will be used
TC_PAIRS_VALID_BEG =
TC_PAIRS_VALID_END =

#
# Run MET tc_pairs by indicating the top-level directories for the A-deck and B-deck files. Set to 'yes' to
# run using top-level directories, 'no' if you want to run tc_pairs on files paired by the wrapper.
TC_PAIRS_READ_ALL_FILES = no

# set to true or yes to reformat track data into ATCF format expected by tc_pairs
TC_PAIRS_REFORMAT_DECK = no

# OVERWRITE OPTIONS
# Don't overwrite filter files if they already exist.
# Set to yes if you do NOT want to override existing files
# Set to no if you do want to override existing files
TC_PAIRS_SKIP_IF_OUTPUT_EXISTS = no

#
# MET TC-Pairs
#
# List of models to be used (white space or comma separated) eg: DSHP, LGEM, HWRF
# If no models are listed, then process all models in the input file(s).
MODEL =

#TC_PAIRS_DESC =

# List of storm ids of interest (space or comma separated) e.g.: AL112012, AL122012
# If no storm ids are listed, then process all storm ids in the input file(s).
TC_PAIRS_STORM_ID = 

# Basins (of origin/region).  Indicate with space or comma-separated list of regions, eg. AL: for North Atlantic,
# WP: Western North Pacific, CP: Central North Pacific, SH: Southern Hemisphere, IO: North Indian Ocean, LS: Southern
# Hemisphere
TC_PAIRS_BASIN =

# Cyclone, a space or comma-separated list of cyclone numbers. If left empty, all cyclones will be used.
TC_PAIRS_CYCLONE =

# Storm name, a space or comma-separated list of storm names to evaluate.  If left empty, all storms will be used.
TC_PAIRS_STORM_NAME =

# DLAND file, the full path of the file that contains the gridded representation of the
# minimum distance from land.
TC_PAIRS_DLAND_FILE = MET_BASE/tc_data/dland_global_tenth_degree.nc

# setting this so that when verifying against analysis track, the union of points are written
TC_PAIRS_MET_CONFIG_OVERRIDES = match_points = FALSE;

##
# only 00, 06, 12, and 18z init times are supported in NOAA website,
# so for consistency, these are the only options for METplus.
#
CYCLONE_PLOTTER_INIT_DATE = {init?fmt=%Y%m%d}
CYCLONE_PLOTTER_INIT_HR = {init?fmt=%H}
CYCLONE_PLOTTER_MODEL = GFSO
CYCLONE_PLOTTER_PLOT_TITLE = Model Forecast Storm Tracks

##
#  Indicate the size of symbol (point size)
CYCLONE_PLOTTER_CIRCLE_MARKER_SIZE = 2
CYCLONE_PLOTTER_CROSS_MARKER_SIZE = 3

##
# Indicate text size of annotation label
CYCLONE_PLOTTER_ANNOTATION_FONT_SIZE = 3

##
# Resolution of saved plot in dpi (dots per inch)
# Set to 0 to allow Matplotlib to determine, based on your computer
CYCLONE_PLOTTER_RESOLUTION_DPI = 400

CYCLONE_PLOTTER_GENERATE_TRACK_ASCII = yes

CYCLONE_PLOTTER_ADD_WATERMARK = False

#
#  DIRECTORIES
#
[dir]
# Location of input track data directory
# for ADECK and BDECK data

USER_SCRIPT_OUTPUT_DIR = {OUTPUT_BASE}/decks

TC_PAIRS_ADECK_INPUT_DIR = {USER_SCRIPT_OUTPUT_DIR}/adeck
TC_PAIRS_BDECK_INPUT_DIR = {USER_SCRIPT_OUTPUT_DIR}/adeck

TC_PAIRS_OUTPUT_DIR = {OUTPUT_BASE}/tc_pairs

CYCLONE_PLOTTER_INPUT_DIR = {TC_PAIRS_OUTPUT_DIR}
CYCLONE_PLOTTER_OUTPUT_DIR = {OUTPUT_BASE}/cyclone

[filename_templates]
TC_PAIRS_ADECK_TEMPLATE = adeck.{init?fmt=%Y%m%d%H}.{cyclone}.dat
TC_PAIRS_BDECK_TEMPLATE = adeck.{init?fmt=%Y%m%d%H}.{cyclone}.dat
TC_PAIRS_OUTPUT_TEMPLATE = tc_pairs.{init?fmt=%Y%m%d%H}.{cyclone}
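Templates such as adeck.{init?fmt=%Y%m%d%H}.{cyclone}.dat are expanded by METplus at run time. A minimal sketch of that substitution using strftime follows; this is a simplification for illustration, not the real METplus template engine, and the cyclone value is made up:

```python
import re
from datetime import datetime

def resolve(template, init, **tags):
    """Fill {init?fmt=...} with the formatted init time, then any
    plain {tag} entries (simplified METplus-style templating)."""
    def sub_init(match):
        return init.strftime(match.group(1))
    out = re.sub(r"\{init\?fmt=([^}]+)\}", sub_init, template)
    return out.format(**tags)

init = datetime(2020, 10, 7, 0)
print(resolve("adeck.{init?fmt=%Y%m%d%H}.{cyclone}.dat", init, cyclone="0001"))
# -> adeck.2020100700.0001.dat
```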

MET Configuration

METplus sets environment variables based on user settings in the METplus configuration file. See How METplus controls MET config file settings for more details.

YOU SHOULD NOT SET ANY OF THESE ENVIRONMENT VARIABLES YOURSELF! THEY WILL BE OVERWRITTEN BY METPLUS WHEN IT CALLS THE MET TOOLS!

If there is a setting in the MET configuration file that is currently not supported by METplus you’d like to control, please refer to: Overriding Unsupported MET config file settings

Note

See the TCPairs MET Configuration section of the User’s Guide for more information on the environment variables used in the file below:

////////////////////////////////////////////////////////////////////////////////
//
// Default TCPairs configuration file
//
////////////////////////////////////////////////////////////////////////////////

//
// ATCF file format reference:
//   http://www.nrlmry.navy.mil/atcf_web/docs/database/new/abrdeck.html
//

//
// Models
//
${METPLUS_MODEL}

//
// Description
//
${METPLUS_DESC}

//
// Storm identifiers
//
${METPLUS_STORM_ID}

//
// Basins
//
${METPLUS_BASIN}

//
// Cyclone numbers
//
${METPLUS_CYCLONE}

//
// Storm names
//
${METPLUS_STORM_NAME}

//
// Model initialization time windows to include or exclude
//
${METPLUS_INIT_BEG}
${METPLUS_INIT_END}
${METPLUS_INIT_INCLUDE}
${METPLUS_INIT_EXCLUDE}

//
// Valid model time window
//
${METPLUS_VALID_BEG}
${METPLUS_VALID_END}

//
// Model initialization hours
//
init_hour = [];

//
// Required lead time in hours
//
lead_req = [];

//
// lat/lon polylines defining masking regions
//
init_mask  = "";
valid_mask = "";

//
// Specify if the code should check for duplicate ATCF lines
//
check_dup = FALSE;

//
// Specify special processing to be performed for interpolated models.
// Set to NONE, FILL, or REPLACE.
//
interp12 = REPLACE;

//
// Specify how consensus forecasts should be defined
//
consensus = [];

//
// Forecast lag times
//
lag_time = [];

//
// CLIPER/SHIFOR baseline forecasts to be derived from the BEST
// and operational (CARQ) tracks.
//
best_technique = [ "BEST" ];
best_baseline  = [];
oper_technique = [ "CARQ" ];
oper_baseline  = [];

//
// Specify the datasets to be searched for analysis tracks (NONE, ADECK, BDECK,
// or BOTH).
//
anly_track = BDECK;

//
// Specify if only those track points common to both the ADECK and BDECK
// tracks be written out.
//
match_points = TRUE;

//
// Specify the NetCDF output of the gen_dland tool containing a gridded
// representation of the minimum distance to land.
//
${METPLUS_DLAND_FILE}

//
// Specify watch/warning information:
//   - Input watch/warning filename
//   - Watch/warning time offset in seconds
//
watch_warn = {
   file_name   = "MET_BASE/tc_data/wwpts_us.txt";
   time_offset = -14400;
}

//
// Indicate a version number for the contents of this configuration file.
// The value should generally not be modified.
//
//version = "V9.0";

${METPLUS_MET_CONFIG_OVERRIDES}

Python Embedding

This use case uses a Python embedding script to read input data. Because the source file already contains “analysis” tracks for the extra TCs, this Python script only needs to output storm tracks that have a valid time matching the user input. These storms are put into separate storm files, to better mimic how TC storms are typically passed to TCPairs.

parm/use_cases/model_applications/tc_and_extra_tc/CyclonePlotter_fcstGFS_obsGFS_OPC/extract_opc_decks.py

#! /usr/bin/env python3

#
#  program extract_opc_decks.py
#
#  reads in EMC 2020 cyclone data
#  takes 3 command line arguments:
#  1) input file (full path, e.g. "/d2/projects/d2/projects/extra-tc_verif/gpfs/dell1/nco/ops/com/gentracks/prod/gentracks/{init?fmt=%Y}/trak.gfso.atcf_gen.glbl.{init?fmt=%Y}")
#  2) output directory (e.g. "{OUTPUT_BASE}/decks")
#  3) init time (YYYYMMDDHH)
#
#  reads all data in the input file, creates an ADECK from all points valid at the init time
#    (key 'YYYYMMDDHH') and a BDECK (key 'STORMNAME') for all storms in the ADECK where the
#    forecast key 'TAU' = '000' (0 hrs)
#  writes a separate adeck file for each storm (bdeck output is currently disabled)
#
#  further processed by TCPairs (extra-tropical) and CyclonePlotter in the single use-case
#    wrapper CyclonePlotter_fcstGFS_obsGFS_OPC
#
#  written February 2021 by George McCabe (mccabe@ucar.edu)
#

import sys
import os
import pandas as pd

# column names/dictionary keys for the trak.data file
atcf_headers_trak=['BASIN','CYCLONE','STORMNAME','YYYYMMDDHH','TECHNUM/MIN','TECH','TAU','LAT','LON',
                   'VMAX','MSLP','TY','RAD','WINDCODE','RAD1','RAD2','RAD3','RAD4','POUTER',
                   'ROUTER','RMW','GUSTS','EYE','SUBREGION','MAXSEAS','INITIALS','DIR','SPEED','F1','F2',
                   'STORMNAME2','DEPTH','SEAS','SEASCODE','SEAS1','SEAS2','SEAS3','SEAS4']

# needs exactly 3 arguments (see above)
num_args = len(sys.argv) - 1

if num_args < 3:
    print("ERROR: Not enough arguments. "
          "Usage: extract_opc_decks.py <input_file> <output_dir> <YYYYMMDDHH>")
    sys.exit(1)
debug = 'debug' in sys.argv
# function to compare storm warning time to search time
def is_equal(column_val, search_string):
    return str(column_val).strip() == search_string

input_file = sys.argv[1]
output_dir = sys.argv[2]
search_date = sys.argv[3]

if debug:
    print(f"Running {__file__}\nSearch date: {search_date}")

# get 2 digit year to use in CYCLONE column substitute value
search_year = search_date[2:4]

# string to use in output file names for filtered adeck and bdeck files
file_prefix = f'deck.{search_date}.'

# an intermediate directory path for the separate files
adeck_base = os.path.join(output_dir, "adeck")
#bdeck_base = os.path.join(output_dir, "bdeck")

# create output directories if not already there
if not os.path.exists(adeck_base):
    print(f"Creating output directory: {adeck_base}")
    os.makedirs(adeck_base)

#if not os.path.exists(bdeck_base):
#    print(f"Creating output directory: {bdeck_base}")
#    os.makedirs(bdeck_base)

# using pandas (pd), read input file
print(f"Reading input file: {input_file}")
pd_data = pd.read_csv(input_file, names=atcf_headers_trak)

print("Filtering data...")

# get all 0-hour analysis data to use as the bdeck
print("Filtering rows where TAU (forecast hour) == 0 for bdeck")
pd_0hr_data = pd_data[pd_data['TAU'] == 0]

# get adeck - all lines that match the desired date for YYYYMMDDHH (init time)
print(f"Filtering data with {search_date} in YYYYMMDDHH column for adeck")
init_matches = pd_data['YYYYMMDDHH'].apply(is_equal,
                                           args=(search_date,))
adeck = pd_data[init_matches]

# get list of STORMNAMEs from adeck data
all_storms = adeck.STORMNAME.unique()

# initialize counter to use to set output filenames with "cyclone" number
# to keep storms in separate files
index = 0

# loop over storms
for storm_name in all_storms:
    index_pad = str(index).zfill(4)

    # remove whitespace at beginning of storm name
    storm_name = storm_name.strip()

    # get 0hr data for given storm to use as bdeck
    storm_b_match = pd_0hr_data['STORMNAME'].apply(is_equal,
                                                   args=(storm_name,))
    storm_bdeck = pd_0hr_data[storm_b_match]
    if debug:
        print(f"Processing storm: {storm_name}")
    wrote_a = wrote_b = False

    #Logic for writing out Analysis files. Currently commented out,
    #but left in for possible future use
    if not storm_bdeck.empty:
    #    bdeck_filename = f'b{file_prefix}{index_pad}.dat'
    #    bdeck_path = os.path.join(bdeck_base, bdeck_filename)

    #    print(f"Writing bdeck to {bdeck_path}")
    #    storm_bdeck.to_csv(bdeck_path, header=False, index=False)
        wrote_b = True
    #else:
    #    print(f"BDECK for {storm_name} is empty. Skipping")

    # filter out adeck data for given storm
    storm_a_match = adeck['STORMNAME'].apply(is_equal,
                                             args=(storm_name,))
    storm_adeck = adeck[storm_a_match]

    if not storm_adeck.empty:
        adeck_filename = f'a{file_prefix}{index_pad}.dat'
        adeck_path = os.path.join(adeck_base, adeck_filename)
        if debug:
            print(f"Writing adeck to {adeck_path}")
        storm_adeck.to_csv(adeck_path, header=False, index=False)
        wrote_a = True
    else:
        if debug:
            print(f"ADECK for {storm_name} is empty. Skipping")

    if wrote_a or wrote_b:
        index += 1

print("Finished processing all storms")
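The script's two filters, the 0-hour analysis rows and the rows valid at the requested init time, can be exercised on a small plain-Python stand-in for the track data (storm names and dates below are illustrative):

```python
# Toy stand-in for the trak file: two storms, two valid times.
rows = [
    {'STORMNAME': 'ALPHA', 'YYYYMMDDHH': '2020100700', 'TAU': 0},
    {'STORMNAME': 'ALPHA', 'YYYYMMDDHH': '2020100706', 'TAU': 6},
    {'STORMNAME': 'BRAVO', 'YYYYMMDDHH': '2020100700', 'TAU': 0},
]
search_date = '2020100700'

# "adeck": every row valid at the requested init time
adeck = [r for r in rows if str(r['YYYYMMDDHH']).strip() == search_date]
# "bdeck": the 0-hour analysis points used as the verification track
bdeck = [r for r in rows if r['TAU'] == 0]

print(sorted({r['STORMNAME'] for r in adeck}))  # -> ['ALPHA', 'BRAVO']
print(len(bdeck))                               # -> 2
```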

Running METplus

It is recommended to run this use case by:

Passing in CyclonePlotter_fcstGFS_obsGFS_OPC.conf then a user-specific system configuration file:

run_metplus.py -c /path/to/CyclonePlotter_fcstGFS_obsGFS_OPC.conf -c /path/to/user_system.conf

The following METplus configuration variables must be set correctly to run this example:

  • INPUT_BASE - Path to directory where EMC data files (csv) are read (See Datasets section to obtain tarballs).

  • OUTPUT_BASE - Path where METplus output will be written. This must be in a location where you have write permissions.

  • MET_INSTALL_DIR - Path to location where MET is installed locally

Example User Configuration File:

[dir]
INPUT_BASE = /path/to/sample/input/data
OUTPUT_BASE = /path/to/output/dir
MET_INSTALL_DIR = /path/to/met-X.Y

NOTE: All of these items must be found under the [dir] section.

Expected Output

A successful run will output the following both to the screen and to the logfile:

INFO: METplus has successfully finished running.

Refer to the value set for OUTPUT_BASE to find where the output data was generated. Output for this use case will be found in the decks, tc_pairs, and cyclone directories (relative to OUTPUT_BASE) and will contain the following files:

  • decks/adeck/adeck.2020100700.xxxx.dat

  • tc_pairs/tc_pairs.2020100700.xxxx.tcst

  • cyclone/20201007.png

  • cyclone/20201007.txt

where “xxxx” is the unique four-digit storm identifier assigned by the user script for the TCPairs wrapper to use.
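A quick way to confirm these products were written is a small helper like the one below. The storm index "0001" is illustrative (yours may differ), and output_base must be replaced with your OUTPUT_BASE value:

```python
import os

def missing_outputs(output_base, expected):
    """Return the expected products (relative paths) not found under output_base."""
    return [rel for rel in expected
            if not os.path.exists(os.path.join(output_base, rel))]

# Relative paths from the Expected Output list above.
expected = [
    "decks/adeck/adeck.2020100700.0001.dat",
    "tc_pairs/tc_pairs.2020100700.0001.tcst",
    "cyclone/20201007.png",
    "cyclone/20201007.txt",
]
print(missing_outputs("/path/to/output/dir", expected))
```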
