CyclonePlotter: Extra-TC Tracker and Plotting Capabilities

model_applications/tc_and_extra_tc/CyclonePlotter_fcstGFS_obsGFS_UserScript_ExtraTC.conf

Scientific Objective

Once this use case is complete, a user-created extratropical cyclone (extra-TC) track file for the valid date of interest (YYYYMMDDHH) will have been created, paired by TCPairs, and the global storm tracks for that date will be plotted by CyclonePlotter (PlateCarree projection).

Datasets

Forecast: Adeck
/path/to/{init?fmt=%Y}/trak.gfso.atcf_gen.glbl.{init?fmt=%Y}
Observation: Bdeck
/path/to/{init?fmt=%Y}/trak.gfso.atcf_gen.glbl.{init?fmt=%Y}
Location: All of the input data required for this use case can be found in the met_test sample data tarball. Navigate to the METplus releases page and download the sample data for the appropriate release: https://github.com/dtcenter/METplus/releases
The tarball should be unpacked into the directory that you set as the value of INPUT_BASE. See the Running METplus section for more information.
Data Source: GFS

External Dependencies

You will need to use a version of Python 3.6+ that has the following packages installed:

  • cartopy

  • matplotlib

METplus Components

This use case runs a Python user script to create output files that are then read by the TCPairs wrapper. Because the source file already contains tracked extra-TCs, the TCPairs wrapper is passed the “Adeck” file for each storm twice: once as the adeck (forecast) file and once as the bdeck (analysis) file. Essentially, TCPairs is matching each forecast to itself. The use case then runs the CyclonePlotter wrapper to create a global plot of storm tracks for the desired day of interest (YYYYMMDDHH).
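
Because the same user-script output file is supplied as both the adeck and bdeck input, the file paths TCPairs resolves from the two filename templates are identical. The short sketch below is illustrative only (the 4-digit cyclone value 0001 is a hypothetical storm index of the kind assigned by the user script); it simply shows the File I/O templates from the configuration resolving to the same file for both decks.

import os
from datetime import datetime

init = datetime(2020, 10, 7, 0)   # init/valid time for this use case
cyclone = "0001"                  # hypothetical 4-digit storm index
deck_dir = "decks/adeck"          # {USER_SCRIPT_OUTPUT_DIR}/adeck

# TC_PAIRS_ADECK_TEMPLATE and TC_PAIRS_BDECK_TEMPLATE use the same pattern
filename = f"adeck.{init:%Y%m%d%H}.{cyclone}.dat"

adeck_path = os.path.join(deck_dir, filename)
bdeck_path = os.path.join(deck_dir, filename)

assert adeck_path == bdeck_path   # decks/adeck/adeck.2020100700.0001.dat for both
print(adeck_path)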

METplus Workflow

TCPairs is the first tool called in this example. It processes the following run times for each storm file:

Init/Valid: 2020100700

CyclonePlotter is the second (and final) tool called in this example. It processes the output from TCPairs.

METplus Configuration

METplus first loads all of the configuration files found in parm/metplus_config, then it loads any configuration files passed to METplus via the command line with the -c option, i.e. -c /path/to/CyclonePlotter_fcstGFS_obsGFS_UserScript_ExtraTC.conf

[config]

# Documentation for this use case can be found at
# https://metplus.readthedocs.io/en/latest/generated/model_applications/tc_and_extra_tc/CyclonePlotter_fcstGFS_obsGFS_UserScript_ExtraTC.html

# For additional information, please see the METplus Users Guide.
# https://metplus.readthedocs.io/en/latest/Users_Guide

###
# Processes to run
# https://metplus.readthedocs.io/en/latest/Users_Guide/systemconfiguration.html#process-list
###

PROCESS_LIST = UserScript, TCPairs, CyclonePlotter


###
# Time Info
# LOOP_BY options are INIT, VALID, RETRO, and REALTIME
# If set to INIT or RETRO:
#   INIT_TIME_FMT, INIT_BEG, INIT_END, and INIT_INCREMENT must also be set
# If set to VALID or REALTIME:
#   VALID_TIME_FMT, VALID_BEG, VALID_END, and VALID_INCREMENT must also be set
# LEAD_SEQ is the list of forecast leads to process
# https://metplus.readthedocs.io/en/latest/Users_Guide/systemconfiguration.html#timing-control
###

LOOP_BY = INIT
INIT_TIME_FMT = %Y%m%d%H
INIT_BEG = 2020100700
INIT_END = 2020100700
INIT_INCREMENT = 21600

USER_SCRIPT_RUNTIME_FREQ = RUN_ONCE_PER_INIT_OR_VALID


###
# File I/O
# https://metplus.readthedocs.io/en/latest/Users_Guide/systemconfiguration.html#directory-and-filename-template-info
###

USER_SCRIPT_OUTPUT_DIR = {OUTPUT_BASE}/decks

TC_PAIRS_ADECK_INPUT_DIR = {USER_SCRIPT_OUTPUT_DIR}/adeck
TC_PAIRS_ADECK_TEMPLATE = adeck.{init?fmt=%Y%m%d%H}.{cyclone}.dat

TC_PAIRS_BDECK_INPUT_DIR = {USER_SCRIPT_OUTPUT_DIR}/adeck
TC_PAIRS_BDECK_TEMPLATE = adeck.{init?fmt=%Y%m%d%H}.{cyclone}.dat

TC_PAIRS_OUTPUT_DIR = {OUTPUT_BASE}/tc_pairs
TC_PAIRS_OUTPUT_TEMPLATE = tc_pairs.{init?fmt=%Y%m%d%H}.{cyclone}

CYCLONE_PLOTTER_INPUT_DIR = {TC_PAIRS_OUTPUT_DIR}
CYCLONE_PLOTTER_OUTPUT_DIR = {OUTPUT_BASE}/cyclone


###
# UserScript Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#userscript
###

USER_SCRIPT_PATH = {PARM_BASE}/use_cases/model_applications/tc_and_extra_tc/CyclonePlotter_fcstGFS_obsGFS_UserScript_ExtraTC/extract_opc_decks.py

USER_SCRIPT_INPUT_PATH = {INPUT_BASE}/model_applications/tc_and_extra_tc/CyclonePlotter_fcstGFS_obsGFS_UserScript_ExtraTC/trak.gfso.atcf_gen.glbl.{init?fmt=%Y}

USER_SCRIPT_COMMAND = {USER_SCRIPT_PATH} {USER_SCRIPT_INPUT_PATH} {USER_SCRIPT_OUTPUT_DIR} {init?fmt=%Y%m%d%H}


###
# TCPairs Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#tcpairs
###

TC_PAIRS_DLAND_FILE = MET_BASE/tc_data/dland_global_tenth_degree.nc

TC_PAIRS_MATCH_POINTS = FALSE


###
# CyclonePlotter Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#cycloneplotter
###

CYCLONE_PLOTTER_INIT_DATE = {init?fmt=%Y%m%d}
CYCLONE_PLOTTER_INIT_HR = {init?fmt=%H}
CYCLONE_PLOTTER_MODEL = GFSO
CYCLONE_PLOTTER_PLOT_TITLE = Model Forecast Storm Tracks

CYCLONE_PLOTTER_GLOBAL_PLOT = no

CYCLONE_PLOTTER_WEST_LON = -180
CYCLONE_PLOTTER_EAST_LON = 179
CYCLONE_PLOTTER_SOUTH_LAT = 0
CYCLONE_PLOTTER_NORTH_LAT = 90

CYCLONE_PLOTTER_CIRCLE_MARKER_SIZE = 4
CYCLONE_PLOTTER_CROSS_MARKER_SIZE = 6

CYCLONE_PLOTTER_ANNOTATION_FONT_SIZE = 3

CYCLONE_PLOTTER_LEGEND_FONT_SIZE = 3

CYCLONE_PLOTTER_RESOLUTION_DPI = 400

CYCLONE_PLOTTER_GENERATE_TRACK_ASCII = yes

CYCLONE_PLOTTER_ADD_WATERMARK = False
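
For orientation, the sketch below is a minimal, standalone example of the kind of PlateCarree map that CyclonePlotter draws over the extent configured above; it is not the wrapper's actual plotting code, and the output filename is hypothetical.

import matplotlib
matplotlib.use("Agg")             # write to file without a display
import matplotlib.pyplot as plt
import cartopy.crs as ccrs

fig = plt.figure(dpi=400)         # matches CYCLONE_PLOTTER_RESOLUTION_DPI
ax = fig.add_subplot(1, 1, 1, projection=ccrs.PlateCarree())

# extent from CYCLONE_PLOTTER_WEST_LON/EAST_LON/SOUTH_LAT/NORTH_LAT
ax.set_extent([-180, 179, 0, 90], crs=ccrs.PlateCarree())
ax.coastlines()
ax.set_title("Model Forecast Storm Tracks")

fig.savefig("example_extent.png")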

MET Configuration

METplus sets environment variables based on user settings in the METplus configuration file. See How METplus controls MET config file settings for more details.

YOU SHOULD NOT SET ANY OF THESE ENVIRONMENT VARIABLES YOURSELF! THEY WILL BE OVERWRITTEN BY METPLUS WHEN IT CALLS THE MET TOOLS!

If there is a setting in the MET configuration file that is currently not supported by METplus you’d like to control, please refer to: Overriding Unsupported MET config file settings

Note

See the TCPairs MET Configuration section of the User’s Guide for more information on the environment variables used in the file below:

////////////////////////////////////////////////////////////////////////////////
//
// Default TCPairs configuration file
//
////////////////////////////////////////////////////////////////////////////////

//
// ATCF file format reference:
//   http://www.nrlmry.navy.mil/atcf_web/docs/database/new/abrdeck.html
//

//
// Models
//
${METPLUS_MODEL}

//
// Description
//
${METPLUS_DESC}

//
// Storm identifiers
//
${METPLUS_STORM_ID}

//
// Basins
//
${METPLUS_BASIN}

//
// Cyclone numbers
//
${METPLUS_CYCLONE}

//
// Storm names
//
${METPLUS_STORM_NAME}

//
// Model initialization time windows to include or exclude
//
${METPLUS_INIT_BEG}
${METPLUS_INIT_END}
// init_inc =
${METPLUS_INIT_INC}
// init_exc =
${METPLUS_INIT_EXC}

// valid_inc =
${METPLUS_VALID_INC}
// valid_exc =
${METPLUS_VALID_EXC}

// write_valid =
${METPLUS_WRITE_VALID}

//
// Valid model time window
//
${METPLUS_VALID_BEG}
${METPLUS_VALID_END}

//
// Model initialization hours
//
init_hour = [];

//
// Required lead time in hours
//
lead_req = [];

//
// lat/lon polylines defining masking regions
//
init_mask  = "";
valid_mask = "";

//
// Specify if the code should check for duplicate ATCF lines
//
//check_dup =
${METPLUS_CHECK_DUP}


//
// Specify special processing to be performed for interpolated models.
// Set to NONE, FILL, or REPLACE.
//
//interp12 =
${METPLUS_INTERP12}

//
// Specify how consensus forecasts should be defined
//
//consensus =
${METPLUS_CONSENSUS_LIST}


//
// Forecast lag times
//
lag_time = [];

//
// CLIPER/SHIFOR baseline forecasts to be derived from the BEST
// and operational (CARQ) tracks.
//
best_technique = [ "BEST" ];
best_baseline  = [];
oper_technique = [ "CARQ" ];
oper_baseline  = [];

//
// Specify the datasets to be searched for analysis tracks (NONE, ADECK, BDECK,
// or BOTH).
//
anly_track = BDECK;

//
// Specify if only those track points common to both the ADECK and BDECK
// tracks be written out.
//
//match_points =
${METPLUS_MATCH_POINTS}

//
// Specify the NetCDF output of the gen_dland tool containing a gridded
// representation of the minimum distance to land.
//
${METPLUS_DLAND_FILE}

//
// Specify watch/warning information:
//   - Input watch/warning filename
//   - Watch/warning time offset in seconds
//
watch_warn = {
   file_name   = "MET_BASE/tc_data/wwpts_us.txt";
   time_offset = -14400;
}


//diag_info_map = {
${METPLUS_DIAG_INFO_MAP_LIST}

//diag_convert_map = {
${METPLUS_DIAG_CONVERT_MAP_LIST}

//
// Indicate a version number for the contents of this configuration file.
// The value should generally not be modified.
//
//version = "V9.0";

tmp_dir = "${MET_TMP_DIR}";

${METPLUS_MET_CONFIG_OVERRIDES}

Python Embedding

This use case uses a Python embedding script to read input data. Because the source file already contains “analysis” tracks for the extra TCs, this Python script only needs to output storm tracks that have a valid time matching the user input. These storms are put into separate storm files, to better mimic how TC storms are typically passed to TCPairs.

parm/use_cases/model_applications/tc_and_extra_tc/CyclonePlotter_fcstGFS_obsGFS_UserScript_ExtraTC/extract_opc_decks.py

#! /usr/bin/env python3

#
#  program extract_opc_decks.py
#
#  reads in EMC 2020 cyclone data 
#  takes 3 command line arguments
#  1) input file (full path, eg, "/d2/projects/d2/projects/extra-tc_verif/gpfs/dell1/nco/ops/com/gentracks/prod/gentracks/{init?fmt=%Y}/trak.gfso.atcf_gen.glbl.{init?fmt=%Y}")
#  2) output directory (eg "{OUTPUT_BASE}/decks")
#  3) init time (YYYYMMDDHH)
#
#  reads all data in input file, creates ADECK using all points valid at init time (key 'YYYYMMDDHH'), creates BDECK
#    using key ('STORMNAME') for all storms in ADECK where forecast key ('TAU') = '000' or 0 hrs (BDECK output is currently commented out)
#  writes a separate adeck file for each storm found at the requested init time
#
#  further processed by TCPairs (extra-tropical) and CyclonePlotter in the single use case CyclonePlotter_fcstGFS_obsGFS_UserScript_ExtraTC
#
#  written February 2021 by George McCabe (mccabe@ucar.edu)
#

import sys
import os
import pandas as pd

# column names/dictionary keys for the trak.data file
atcf_headers_trak=['BASIN','CYCLONE','STORMNAME','YYYYMMDDHH','TECHNUM/MIN','TECH','TAU','LAT','LON',
                   'VMAX','MSLP','TY','RAD','WINDCODE','RAD1','RAD2','RAD3','RAD4','POUTER',
                   'ROUTER','RMW','GUSTS','EYE','SUBREGION','MAXSEAS','INITIALS','DIR','SPEED','F1','F2',
                   'STORMNAME2','DEPTH','SEAS','SEASCODE','SEAS1','SEAS2','SEAS3','SEAS4']

# requires at least 3 arguments (see above); passing 'debug' as an extra argument enables verbose output
num_args = len(sys.argv) - 1

if num_args < 3:
    print("ERROR: Not enough arguments")
    sys.exit(1)
debug = 'debug' in sys.argv
# function to compare storm warning time to search time
def is_equal(column_val, search_string):
    return str(column_val).strip() == search_string

input_file = sys.argv[1]
output_dir = sys.argv[2]
search_date = sys.argv[3]

if debug:
    print(f"Running {__file__}\nSearch date: {search_date}")

# get 2 digit year to use in CYCLONE column substitute value
search_year = search_date[2:4]

# string to use in output file names for filtered adeck and bdeck files
file_prefix = f'deck.{search_date}.'

# an intermediate directory path for the separate files
adeck_base = os.path.join(output_dir, "adeck")
#bdeck_base = os.path.join(output_dir, "bdeck")

# create output directories if not already there
if not os.path.exists(adeck_base):
    print(f"Creating output directory: {adeck_base}")
    os.makedirs(adeck_base)

#if not os.path.exists(bdeck_base):
#    print(f"Creating output directory: {bdeck_base}")
#    os.makedirs(bdeck_base)

# using pandas (pd), read input file
print(f"Reading input file: {input_file}")
pd_data = pd.read_csv(input_file, names=atcf_headers_trak)

print(f"Filtering data...")

# get all 0 hour analyses data
print(f"Filtering data 0 (hr) in TAU (forecast hour) column for bdeck")
pd_0hr_data = pd_data[pd_data['TAU'] == 0]

# get adeck - all lines that match the desired date for YYYYMMDDHH (init time)
print(f"Filtering data with {search_date} in YYYYMMDDHH column for adeck")
init_matches = pd_data['YYYYMMDDHH'].apply(is_equal,
                                           args=(search_date,))
adeck = pd_data[init_matches]

# get list of STORMNAMEs from adeck data
all_storms = adeck.STORMNAME.unique()

# initialize counter to use to set output filenames with "cyclone" number
# to keep storms in separate files
index = 0

# loop over storms
for storm_name in all_storms:
    index_pad = str(index).zfill(4)

    # remove whitespace at beginning of storm name
    storm_name = storm_name.strip()

    # get 0hr data for given storm to use as bdeck
    storm_b_match = pd_0hr_data['STORMNAME'].apply(is_equal,
                                                   args=(storm_name,))
    storm_bdeck = pd_0hr_data[storm_b_match]
    if debug:
        print(f"Processing storm: {storm_name}")
    wrote_a = wrote_b = False

    #Logic for writing out Analysis files. Currently commented out,
    #but left in for possible future use
    if not storm_bdeck.empty:
    #    bdeck_filename = f'b{file_prefix}{index_pad}.dat'
    #    bdeck_path = os.path.join(bdeck_base, bdeck_filename)

    #    print(f"Writing bdeck to {bdeck_path}")
    #    storm_bdeck.to_csv(bdeck_path, header=False, index=False)
        wrote_b = True
    #else:
    #    print(f"BDECK for {storm_name} is empty. Skipping")

    # filter out adeck data for given storm
    storm_a_match = adeck['STORMNAME'].apply(is_equal,
                                             args=(storm_name,))
    storm_adeck = adeck[storm_a_match]

    if not storm_adeck.empty:
        adeck_filename = f'a{file_prefix}{index_pad}.dat'
        adeck_path = os.path.join(adeck_base, adeck_filename)
        if debug:
            print(f"Writing adeck to {adeck_path}")
        storm_adeck.to_csv(adeck_path, header=False, index=False)
        wrote_a = True
    else:
        if debug:
            print(f"ADECK for {storm_name} is empty. Skipping")

    if wrote_a or wrote_b:
        index += 1

print("Finished processing all storms")

Running METplus

It is recommended to run this use case by:

Passing in CyclonePlotter_fcstGFS_obsGFS_UserScript_ExtraTC.conf then a user-specific system configuration file:

run_metplus.py -c /path/to/CyclonePlotter_fcstGFS_obsGFS_UserScript_ExtraTC.conf -c /path/to/user_system.conf

The following METplus configuration variables must be set correctly to run this example:

  • INPUT_BASE - Path to the directory where the EMC data files (csv) are read (see the Datasets section above for how to obtain the sample data).

  • OUTPUT_BASE - Path where METplus output will be written. This must be in a location where you have write permissions.

  • MET_INSTALL_DIR - Path to the location where MET is installed locally.

Example User Configuration File:

[dir]
INPUT_BASE = /path/to/sample/input/data
OUTPUT_BASE = /path/to/output/dir
MET_INSTALL_DIR = /path/to/met-X.Y

NOTE: All of these items must be found under the [dir] section.
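
If you want to sanity-check the user configuration file before running, a small script such as the one below (an optional, illustrative helper, not part of METplus) can confirm that the three required variables are present under the [dir] section. It assumes the file is INI-style and readable by Python's configparser.

import configparser
import sys

REQUIRED = ("INPUT_BASE", "OUTPUT_BASE", "MET_INSTALL_DIR")

config = configparser.ConfigParser()
config.read(sys.argv[1])   # e.g. /path/to/user_system.conf

missing = [key for key in REQUIRED if not config.has_option("dir", key)]
if missing:
    print("Missing from [dir]: " + ", ".join(missing))
else:
    print("All required [dir] settings are present")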

Expected Output

A successful run will output the following both to the screen and to the logfile:

INFO: METplus has successfully finished running.

Refer to the value set for OUTPUT_BASE to find where the output data was generated. Output for this use case is written to directories relative to OUTPUT_BASE and will contain the following files:

  • decks/adeck/adeck.2020100700.xxxx.dat

  • tc_pairs/tc_pairs.2020100700.xxxx.tcst

  • cyclone/20201007.png

  • cyclone/20201007.txt

where “xxxx” is the unique four-digit storm identifier assigned by the user script and used by the TCPairs wrapper.
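
Because the adeck and bdeck inputs are the same files, the paired output should show the adeck and bdeck positions agreeing at every track point. The sketch below is one way to spot-check this; it assumes the standard whitespace-delimited .tcst header with ALAT/ALON/BLAT/BLON columns, and the storm index 0000 in the path is hypothetical.

import pandas as pd

# substitute a real .tcst file from the tc_pairs output directory
tcst_file = "tc_pairs/tc_pairs.2020100700.0000.tcst"

pairs = pd.read_csv(tcst_file, sep=r"\s+")

# forecast (A) and "analysis" (B) positions should be identical in this use case
print((pairs["ALAT"] == pairs["BLAT"]).all(),
      (pairs["ALON"] == pairs["BLON"]).all())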

Keywords

Note

  • TCPairsToolUseCase

  • SBUOrgUseCase

  • CyclonePlotterUseCase

  • TropicalCycloneUseCase

Navigate to the METplus Quick Search for Use Cases page to discover other similar use cases.
