5.2.7.5. Multi_Tool: Feature Relative by Init

model_applications/medium_range/TCStat_SeriesAnalysis_fcstGFS_obsGFS_FeatureRelative_SeriesByInit.conf

Scientific Objective

By maintaining the focus of each evaluation time (or, in this case, evaluation time series) on a user-defined area around a cyclone, the model statistical errors associated with cyclonic physical features (moisture flux, stability, strength of the upper-level PV anomaly and jet, etc.) can be related directly to the model forecasts, providing improvement guidance that accurately depicts interactions with significant weather features around and within the cyclone. This is in contrast to the traditional method of regionally averaging cyclone observations on a fixed grid, which “smooths out” system features and limits the meaningful metrics that can be gathered.

Datasets

Relevant information about the datasets used in this use case:

  • TC-Pairs/TC-Stat Forecast dataset: ADeck modified-ATCF tropical cyclone data

  • Series-Analysis Forecast dataset: GFS

  • TC-Pairs/TC-Stat Observation dataset: BDeck modified-ATCF tropical cyclone data

  • Series-Analysis Observation dataset: GFS Analysis

External Dependencies

You will need to use a version of Python 3.6+ that has the following packages installed:

* netCDF4
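
A quick way to confirm the dependency is available in your Python environment is a one-line import check (any equivalent check works):

    python3 -c "import netCDF4; print(netCDF4.__version__)"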

METplus Components

This use case first runs TCPairs and ExtractTiles to generate matched tropical cyclone data and regrid them into appropriately sized tiles along a storm track. The MET tc_stat tool is used to filter the track data, and the MET regrid_data_plane tool is used to regrid the data (GRIB1 or GRIB2 into NetCDF). Next, a series analysis by init time is performed on the results, and plots (.ps and .png) are generated for all variable-level-stat combinations from the specified variables, levels, and requested statistics.
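
This chain of wrappers corresponds to the PROCESS_LIST entry in the METplus configuration shown further below:

PROCESS_LIST = TCPairs, TCStat, ExtractTiles, TCStat(for_series_analysis), SeriesAnalysis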

METplus Workflow

The following tools are used for each run time: TCPairs > RegridDataPlane, TCStat > SeriesAnalysis

This example loops by initialization time. For each initialization time it will process forecast leads 6, 12, 18, 24, 30, 36, and 42. There is only one initialization time in this example, so the following will be run (a lead-sequence sketch follows the run-time list):

Run times:

Init: 20141214_0Z
Forecast lead: 6

Init: 20141214_0Z
Forecast lead: 12

Init: 20141214_0Z
Forecast lead: 18

Init: 20141214_0Z
Forecast lead: 24

Init: 20141214_0Z
Forecast lead: 30

Init: 20141214_0Z
Forecast lead: 36

Init: 20141214_0Z
Forecast lead: 42
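
In METplus, a forecast lead sequence like the one above is typically expressed with the LEAD_SEQ variable. The following is a minimal sketch consistent with these run times; the actual use case file may define the lead sequence differently:

[config]
LEAD_SEQ = 6, 12, 18, 24, 30, 36, 42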

METplus Configuration

METplus first loads all of the configuration files found in parm/metplus_config, then it loads any configuration files passed to METplus via the command line with the -c option, i.e. -c parm/use_cases/model_applications/medium_range/TCStat_SeriesAnalysis_fcstGFS_obsGFS_FeatureRelative_SeriesByInit.conf

#
#  CONFIGURATION
#
[config]

# Loop over each process in the process list (set in PROCESS_LIST) for all times in the time window of
# interest.
LOOP_ORDER = processes
# Configuration files
TC_PAIRS_CONFIG_FILE = {PARM_BASE}/met_config/TCPairsConfig_wrapped

SERIES_ANALYSIS_CONFIG_FILE = {PARM_BASE}/met_config/SeriesAnalysisConfig_wrapped

PROCESS_LIST = TCPairs, TCStat, ExtractTiles, TCStat(for_series_analysis), SeriesAnalysis

# The init time begin and end times, increment
LOOP_BY = INIT
INIT_TIME_FMT = %Y%m%d
INIT_BEG = 20141214
INIT_END = 20141214

# This is the step-size. Increment in seconds from the begin time to the end
# time
INIT_INCREMENT = 21600 ;; set to every 6 hours=21600 seconds

#######
# TCPairs Configurations
#######

# A list of times to include, in format YYYYMMDD_hh
TC_PAIRS_INIT_INCLUDE =

# A list of times to exclude, in format YYYYMMDD_hh
TC_PAIRS_INIT_EXCLUDE =

# Specify model valid time window in format YYYYMM[DD[_hh]].  Only tracks
# that fall within the valid time window will be used.
TC_PAIRS_VALID_BEG =
TC_PAIRS_VALID_END =

#
# Run MET tc_pairs by indicating the top-level directories for the A-deck
# and B-deck files. Set to 'yes' to run using top-level directories, 'no'
# if you want to run tc_pairs on files paired by the wrapper.
TC_PAIRS_READ_ALL_FILES = no

# List of models to be used (white space or comma separated) eg: DSHP, LGEM, HWRF
# If no models are listed, then process all models in the input file(s).
MODEL = GFSO

# List of storm ids of interest (space or comma separated) e.g.: AL112012, AL122012
# If no storm ids are listed, then process all storm ids in the input file(s).
TC_PAIRS_STORM_ID =

# Basins (of origin/region).  Indicate with space or comma-separated list of regions, eg. AL: for North Atlantic,
# WP: Western North Pacific, CP: Central North Pacific, SH: Southern Hemisphere, IO: North Indian Ocean, LS: Southern
# Hemisphere
TC_PAIRS_BASIN =

# Cyclone, a space or comma-separated list of cyclone numbers. If left empty, all cyclones will be used.
TC_PAIRS_CYCLONE =

# Storm name, a space or comma-separated list of storm names to evaluate.  If left empty, all storms will be used.
TC_PAIRS_STORM_NAME =

# DLAND file, the full path of the file that contains the gridded representation of the
# minimum distance from land.
TC_PAIRS_DLAND_FILE = {MET_INSTALL_DIR}/share/met/tc_data/dland_global_tenth_degree.nc

TC_PAIRS_REFORMAT_DECK = yes
TC_PAIRS_REFORMAT_TYPE = SBU

TC_PAIRS_MISSING_VAL_TO_REPLACE = -99
TC_PAIRS_MISSING_VAL = -9999

# overwrite modified track data (non-ATCF to ATCF format)
TC_PAIRS_SKIP_IF_REFORMAT_EXISTS = yes

# overwrite tc_pairs output
TC_PAIRS_SKIP_IF_OUTPUT_EXISTS = yes


#######
# TCStat Configurations
#######
TC_STAT_JOB_ARGS = -job filter -basin ML -dump_row {TC_STAT_OUTPUT_DIR}/{TC_STAT_OUTPUT_TEMPLATE}

# Specify whether only those track points common to both the ADECK and BDECK
# tracks should be written out.  This is only used when explicitly calling
# TC_STAT in the PROCESS_LIST.  This is not used in this use case, so setting
# it to either false or true has no impact.
TC_STAT_MATCH_POINTS = true

# These all map to the options in the default TC-Stat config file, except these
# are pre-pended with TC_STAT to avoid clashing with any other similarly
# named options from other MET tools (eg TC_STAT_AMODEL corresponds to the
# amodel option in the default MET tc-stat config file, whereas AMODEL
# corresponds to the amodel option in the MET tc-pairs config file).

# Stratify by these columns:
TC_STAT_AMODEL =
TC_STAT_BMODEL =
TC_STAT_DESC =
TC_STAT_STORM_ID =
TC_STAT_BASIN =
TC_STAT_CYCLONE =
TC_STAT_STORM_NAME =

# Stratify by init times via a comma-separated list of init times to
# include or exclude.  Time format defined as YYYYMMDD_HH or YYYYMMDD_HHmmss
TC_STAT_INIT_BEG =
TC_STAT_INIT_END =
TC_STAT_INIT_INCLUDE = {init?fmt=%Y%m%d_%H}
TC_STAT_INIT_EXCLUDE =
TC_STAT_INIT_HOUR =
# Stratify by valid times via a comma-separated list of valid times to
# include or exclude.  Time format defined as YYYYMMDD_HH or YYYYMMDD_HHmmss
TC_STAT_VALID_BEG =
TC_STAT_VALID_END =
TC_STAT_VALID_INCLUDE =
TC_STAT_VALID_EXCLUDE =
TC_STAT_VALID_HOUR =
TC_STAT_LEAD_REQ =
TC_STAT_INIT_MASK =
TC_STAT_VALID_MASK =
# Stratify by the valid time and lead time via a comma-separated list of
# times in format HH[MMSS]
TC_STAT_VALID_HOUR =
TC_STAT_LEAD =

# Stratify over the watch_warn column in the tcst file.  Setting this to
# 'ALL' will match HUWARN, HUWATCH, TSWARN, TSWATCH
TC_STAT_TRACK_WATCH_WARN =

# Stratify by applying thresholds to numeric data columns.  Specify with
# comma-separated list of column names and thresholds to be applied.
# The length of TC_STAT_COLUMN_THRESH_NAME should be the same as
# TC_STAT_COLUMN_THRESH_VAL.
TC_STAT_COLUMN_THRESH_NAME =
TC_STAT_COLUMN_THRESH_VAL =

# Stratify by a list of comma-separated columns names and values corresponding
# to non-numeric data columns of the values of interest.
TC_STAT_COLUMN_STR_NAME =
TC_STAT_COLUMN_STR_VAL =

# Stratify by applying thresholds to numeric data columns only when lead=0.
# If lead=0 and the value does not meet the threshold, discard the entire
# track.  The length of TC_STAT_INIT_THRESH_NAME must equal the length of
# TC_STAT_INIT_THRESH_VAL.
TC_STAT_INIT_THRESH_NAME =
TC_STAT_INIT_THRESH_VAL =

# Stratify by performing string matching on non-numeric data columns only
# when lead = 0.  If lead = 0 but the value doesn't match, discard the
# entire track.
TC_STAT_INIT_STR_NAME =
TC_STAT_INIT_STR_VAL =

# Excludes any points where distance to land is <=0. When set to TRUE, once land
# is encountered, the remainder of the forecast track is NOT used for the
# verification, even if the track moves back over water.
TC_STAT_WATER_ONLY =

# TRUE or FALSE.  To specify whether only those track points occurring near
# landfall should be retained. Landfall is the last bmodel track point before
# the distance to land switches from water to land.
TC_STAT_LANDFALL =

# Define the landfall retention window, which is defined as the hours offset
# from the time of landfall. Format is in HH[MMSS]. Default TC_STAT_LANDFALL_BEG
# is set to -24, and TC_STAT_LANDFALL_END is set to 00
#TC_STAT_LANDFALL_BEG = -24
#TC_STAT_LANDFALL_END = 00
TC_STAT_LANDFALL_BEG =
TC_STAT_LANDFALL_END =

# OVERWRITE OPTIONS
# Skip writing extract tiles output files if they already exist.
# Set to yes if you want to skip processing existing files
# Set to no if you want to overwrite existing files
EXTRACT_TILES_SKIP_IF_OUTPUT_EXISTS = yes

# Constants used in creating the tile grid, used by extract tiles
EXTRACT_TILES_NLAT = 60
EXTRACT_TILES_NLON = 60

# Resolution of data in degrees, used by extract tiles
EXTRACT_TILES_DLAT = 0.5
EXTRACT_TILES_DLON = 0.5

# Degrees to subtract from the center lat and lon to
# calculate the lower left lat (lat_ll) and lower
# left lon (lon_ll) for a grid that is 2n X 2m,
# where n = EXTRACT_TILES_LAT_ADJ degrees and m = EXTRACT_TILES_LON_ADJ degrees.
# For this case, where n=15 and m=15, this results
# in a 30 deg X 30 deg grid.  Used by extract tiles
EXTRACT_TILES_LON_ADJ = 15
EXTRACT_TILES_LAT_ADJ = 15

# Settings specific to the TCStat(for_series_analysis) process that was set
# in the PROCESS_LIST. Any TC_STAT_* variable not set in this section will use
# the value set outside of this section
[for_series_analysis]
TC_STAT_JOB_ARGS = -job filter -init_beg {INIT_BEG} -init_end {INIT_END} -dump_row {TC_STAT_OUTPUT_DIR}/{TC_STAT_OUTPUT_TEMPLATE}

TC_STAT_OUTPUT_DIR = {SERIES_ANALYSIS_OUTPUT_DIR}
TC_STAT_LOOKIN_DIR = {EXTRACT_TILES_OUTPUT_DIR}

[config]

# Used by extract tiles and series analysis to define the records of
# interest to be retrieved from the grib2 file
#
BOTH_VAR1_NAME = TMP
BOTH_VAR1_LEVELS = Z2

SERIES_ANALYSIS_RUNTIME_FREQ = RUN_ONCE_PER_INIT_OR_VALID

SERIES_ANALYSIS_RUN_ONCE_PER_STORM_ID = True

# PLOTTING options relevant to the series analysis plots.
# By default, the background map is turned off. Set
# to no to turn off plotting of the background map.
SERIES_ANALYSIS_BACKGROUND_MAP = no

# set the regrid dictionary item to_grid in the SeriesAnalysis MET config file
SERIES_ANALYSIS_REGRID_TO_GRID = FCST
SERIES_ANALYSIS_REGRID_METHOD = FORCE
#SERIES_ANALYSIS_REGRID_WIDTH =
#SERIES_ANALYSIS_REGRID_VLD_THRESH =
#SERIES_ANALYSIS_REGRID_SHAPE =

## NOTE: "TOTAL" is a REQUIRED cnt statistic used by the series analysis scripts
SERIES_ANALYSIS_STAT_LIST = TOTAL, FBAR, OBAR, ME

SERIES_ANALYSIS_BLOCK_SIZE = 4000

# set to True to add the -paired flag to the SeriesAnalysis command
SERIES_ANALYSIS_IS_PAIRED = True

# If True/yes, run plot_data_plane on output from Series-Analysis to generate
# images for each stat item listed in SERIES_ANALYSIS_STAT_LIST
SERIES_ANALYSIS_GENERATE_PLOTS = yes

# If True/yes, run convert on output from Series-Analysis to generate
# a gif using images in groups of name/level/stat
SERIES_ANALYSIS_GENERATE_ANIMATIONS = no

# Title to use when plotting output from Series-Analysis
# Only used if SERIES_ANALYSIS_GENERATE_PLOTS is True/yes
PLOT_DATA_PLANE_TITLE = {MODEL} Init {init?fmt=%Y%m%d_%H} Storm {storm_id} {num_leads} Forecasts (F{fcst_beg} to F{fcst_end}) {stat} for {fcst_name}, {fcst_level}


#
#  FILENAME TEMPLATES
#
[filename_templates]
# Define the format of the filenames
TC_PAIRS_ADECK_TEMPLATE = {date?fmt=%Y%m}/a{basin?fmt=%s}q{date?fmt=%Y%m}*.gfso.{cyclone?fmt=%s}
TC_PAIRS_BDECK_TEMPLATE = {date?fmt=%Y%m}/b{basin?fmt=%s}q{date?fmt=%Y%m}*.gfso.{cyclone?fmt=%s}
TC_PAIRS_OUTPUT_TEMPLATE = {date?fmt=%Y%m}/{basin?fmt=%s}q{date?fmt=%Y%m%d%H}.gfso.{cyclone?fmt=%s}

TC_STAT_OUTPUT_TEMPLATE = {init?fmt=%Y%m%d_%H}/filter_{init?fmt=%Y%m%d_%H}.tcst

EXTRACT_TILES_TC_STAT_INPUT_TEMPLATE = {TC_STAT_OUTPUT_TEMPLATE}

FCST_EXTRACT_TILES_INPUT_TEMPLATE = {init?fmt=%Y%m%d}/gfs_4_{init?fmt=%Y%m%d}_{init?fmt=%H}00_{lead?fmt=%3H}.grb2
OBS_EXTRACT_TILES_INPUT_TEMPLATE = {valid?fmt=%Y%m%d}/gfs_4_{valid?fmt=%Y%m%d}_{valid?fmt=%H}00_000.grb2

FCST_EXTRACT_TILES_OUTPUT_TEMPLATE = {init?fmt=%Y%m%d_%H}/{storm_id}/FCST_TILE_F{lead?fmt=%3H}_gfs_4_{init?fmt=%Y%m%d}_{init?fmt=%H}00_{lead?fmt=%3H}.nc
OBS_EXTRACT_TILES_OUTPUT_TEMPLATE = {init?fmt=%Y%m%d_%H}/{storm_id}/OBS_TILE_F{lead?fmt=%3H}_gfs_4_{init?fmt=%Y%m%d}_{init?fmt=%H}00_{lead?fmt=%3H}.nc

FCST_SERIES_ANALYSIS_INPUT_TEMPLATE = {FCST_EXTRACT_TILES_OUTPUT_TEMPLATE}
OBS_SERIES_ANALYSIS_INPUT_TEMPLATE = {OBS_EXTRACT_TILES_OUTPUT_TEMPLATE}

SERIES_ANALYSIS_TC_STAT_INPUT_TEMPLATE = {TC_STAT_OUTPUT_TEMPLATE}

# Template to look for climatology mean input to SeriesAnalysis relative to SERIES_ANALYSIS_CLIMO_MEAN_INPUT_DIR
# Not used in this example
SERIES_ANALYSIS_CLIMO_MEAN_INPUT_TEMPLATE =

# Template to look for climatology standard deviation input to SeriesAnalysis relative to SERIES_ANALYSIS_CLIMO_STDEV_INPUT_DIR
# Not used in this example
SERIES_ANALYSIS_CLIMO_STDEV_INPUT_TEMPLATE =

SERIES_ANALYSIS_OUTPUT_TEMPLATE = {init?fmt=%Y%m%d_%H}/{storm_id}/series_{fcst_name}_{fcst_level}.nc
#
#  DIRECTORIES
#
[dir]
TC_PAIRS_ADECK_INPUT_DIR = {INPUT_BASE}/model_applications/medium_range/track_data
TC_PAIRS_BDECK_INPUT_DIR = {TC_PAIRS_ADECK_INPUT_DIR}
TC_PAIRS_REFORMAT_DIR = {OUTPUT_BASE}/track_data_atcf
TC_PAIRS_OUTPUT_DIR = {OUTPUT_BASE}/tc_pairs

TC_STAT_LOOKIN_DIR = {TC_PAIRS_OUTPUT_DIR}
TC_STAT_OUTPUT_DIR = {EXTRACT_TILES_OUTPUT_DIR}

EXTRACT_TILES_TC_STAT_INPUT_DIR = {TC_STAT_OUTPUT_DIR}

FCST_EXTRACT_TILES_INPUT_DIR = {INPUT_BASE}/model_applications/medium_range/reduced_model_data
OBS_EXTRACT_TILES_INPUT_DIR = {INPUT_BASE}/model_applications/medium_range/reduced_model_data
EXTRACT_TILES_OUTPUT_DIR = {OUTPUT_BASE}/extract_tiles

FCST_SERIES_ANALYSIS_INPUT_DIR = {EXTRACT_TILES_OUTPUT_DIR}
OBS_SERIES_ANALYSIS_INPUT_DIR = {EXTRACT_TILES_OUTPUT_DIR}
SERIES_ANALYSIS_TC_STAT_INPUT_DIR = {SERIES_ANALYSIS_OUTPUT_DIR}

# directory containing climatology mean input to SeriesAnalysis
# Not used in this example
SERIES_ANALYSIS_CLIMO_MEAN_INPUT_DIR =

# directory containing climatology standard deviation input to SeriesAnalysis
# Not used in this example
SERIES_ANALYSIS_CLIMO_STDEV_INPUT_DIR =

SERIES_ANALYSIS_OUTPUT_DIR = {OUTPUT_BASE}/series_analysis_init
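
To make the filename templates above concrete, here is a hedged illustration of how the ExtractTiles templates resolve for the single run time in this case (init 20141214_00, forecast lead 6, storm ML1200942014). Paths are relative to the corresponding *_INPUT_DIR or *_OUTPUT_DIR, and the exact files depend on the sample data staged under INPUT_BASE:

    FCST_EXTRACT_TILES_INPUT_TEMPLATE   -> 20141214/gfs_4_20141214_0000_006.grb2
    OBS_EXTRACT_TILES_INPUT_TEMPLATE    -> 20141214/gfs_4_20141214_0600_000.grb2
    FCST_EXTRACT_TILES_OUTPUT_TEMPLATE  -> 20141214_00/ML1200942014/FCST_TILE_F006_gfs_4_20141214_0000_006.nc
    OBS_EXTRACT_TILES_OUTPUT_TEMPLATE   -> 20141214_00/ML1200942014/OBS_TILE_F006_gfs_4_20141214_0000_006.nc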

MET Configuration

METplus sets environment variables based on user settings in the METplus configuration file. See How METplus controls MET config file settings for more details.

YOU SHOULD NOT SET ANY OF THESE ENVIRONMENT VARIABLES YOURSELF! THEY WILL BE OVERWRITTEN BY METPLUS WHEN IT CALLS THE MET TOOLS!

If there is a setting in the MET configuration file that is not currently supported by METplus that you would like to control, please refer to: Overriding Unsupported MET config file settings
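
As a hedged illustration of that override mechanism (not used in this use case), an unsupported SeriesAnalysis setting such as rank_corr_flag could be passed through the ${METPLUS_MET_CONFIG_OVERRIDES} placeholder shown in the wrapped config files below by adding a line like the following to the [config] section of a METplus configuration file:

[config]
# Hypothetical override; the value is passed verbatim into the MET config file.
SERIES_ANALYSIS_MET_CONFIG_OVERRIDES = rank_corr_flag = TRUE;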

TCPairsConfig_wrapped

Note

See the TCPairs MET Configuration section of the User’s Guide for more information on the environment variables used in the file below:

////////////////////////////////////////////////////////////////////////////////
//
// Default TCPairs configuration file
//
////////////////////////////////////////////////////////////////////////////////

//
// ATCF file format reference:
//   http://www.nrlmry.navy.mil/atcf_web/docs/database/new/abrdeck.html
//

//
// Models
//
${METPLUS_MODEL}

//
// Description
//
${METPLUS_DESC}

//
// Storm identifiers
//
${METPLUS_STORM_ID}

//
// Basins
//
${METPLUS_BASIN}

//
// Cyclone numbers
//
${METPLUS_CYCLONE}

//
// Storm names
//
${METPLUS_STORM_NAME}

//
// Model initialization time windows to include or exclude
//
${METPLUS_INIT_BEG}
${METPLUS_INIT_END}
${METPLUS_INIT_INCLUDE}
${METPLUS_INIT_EXCLUDE}

//
// Valid model time window
//
${METPLUS_VALID_BEG}
${METPLUS_VALID_END}

//
// Model initialization hours
//
init_hour = [];

//
// Required lead time in hours
//
lead_req = [];

//
// lat/lon polylines defining masking regions
//
init_mask  = "";
valid_mask = "";

//
// Specify if the code should check for duplicate ATCF lines
//
check_dup = FALSE;

//
// Specify special processing to be performed for interpolated models.
// Set to NONE, FILL, or REPLACE.
//
interp12 = REPLACE;

//
// Specify how consensus forecasts should be defined
//
consensus = [];

//
// Forecast lag times
//
lag_time = [];

//
// CLIPER/SHIFOR baseline forecasts to be derived from the BEST
// and operational (CARQ) tracks.
//
best_technique = [ "BEST" ];
best_baseline  = [];
oper_technique = [ "CARQ" ];
oper_baseline  = [];

//
// Specify the datasets to be searched for analysis tracks (NONE, ADECK, BDECK,
// or BOTH).
//
anly_track = BDECK;

//
// Specify if only those track points common to both the ADECK and BDECK
// tracks be written out.
//
match_points = TRUE;

//
// Specify the NetCDF output of the gen_dland tool containing a gridded
// representation of the minimum distance to land.
//
${METPLUS_DLAND_FILE}

//
// Specify watch/warning information:
//   - Input watch/warning filename
//   - Watch/warning time offset in seconds
//
watch_warn = {
   file_name   = "MET_BASE/tc_data/wwpts_us.txt";
   time_offset = -14400;
}

//
// Indicate a version number for the contents of this configuration file.
// The value should generally not be modified.
//
//version = "V9.0";

${METPLUS_MET_CONFIG_OVERRIDES}

TCStatConfig_wrapped

Note

See the TCStat MET Configuration section of the User’s Guide for more information on the environment variables used in the file below:

///////////////////////////////////////////////////////////////////////////////
//
// Default TCStat configuration file
//
////////////////////////////////////////////////////////////////////////////////

//
// The parameters listed below are used to filter the TC-STAT data down to the
// desired subset of lines over which statistics are to be computed.  Only
// those lines which meet ALL of the criteria specified will be retained.
//
// The settings that are common to all jobs may be specified once at the top
// level.  If no selection is listed for a parameter, that parameter will not
// be used for filtering.  If multiple selections are listed for a parameter,
// the analyses will be performed on their union.
//

//
// Stratify by the AMODEL or BMODEL columns.
//
${METPLUS_AMODEL}
${METPLUS_BMODEL}

//
// Stratify by the DESC column.
//
${METPLUS_DESC}

//
// Stratify by the STORM_ID column.
//
${METPLUS_STORM_ID}

//
// Stratify by the BASIN column.
// May add using the "-basin" job command option.
//
${METPLUS_BASIN}

//
// Stratify by the CYCLONE column.
// May add using the "-cyclone" job command option.
//
${METPLUS_CYCLONE}

//
// Stratify by the STORM_NAME column.
// May add using the "-storm_name" job command option.
//
${METPLUS_STORM_NAME}

//
// Stratify by the INIT times.
// Model initialization time windows to include or exclude
// May modify using the "-init_beg", "-init_end", "-init_inc",
// and "-init_exc" job command options.
//
${METPLUS_INIT_BEG}
${METPLUS_INIT_END}
${METPLUS_INIT_INCLUDE}
${METPLUS_INIT_EXCLUDE}

//
// Stratify by the VALID times.
//
${METPLUS_VALID_BEG}
${METPLUS_VALID_END}
${METPLUS_VALID_INCLUDE}
${METPLUS_VALID_EXCLUDE}

//
// Stratify by the initialization and valid hours and lead time.
//
${METPLUS_INIT_HOUR}
${METPLUS_VALID_HOUR}
${METPLUS_LEAD}

//
// Select tracks which contain all required lead times.
//
${METPLUS_LEAD_REQ}

//
// Stratify by the INIT_MASK and VALID_MASK columns.
//
${METPLUS_INIT_MASK}
${METPLUS_VALID_MASK}

//
// Stratify by checking the watch/warning status for each track point
// common to both the ADECK and BDECK tracks.  If the watch/warning status
// of any of the track points appears in the list, retain the entire track.
//
${METPLUS_TRACK_WATCH_WARN}

//
// Stratify by applying thresholds to numeric data columns.
//
${METPLUS_COLUMN_THRESH_NAME}
${METPLUS_COLUMN_THRESH_VAL}

//
// Stratify by performing string matching on non-numeric data columns.
//
${METPLUS_COLUMN_STR_NAME}
${METPLUS_COLUMN_STR_VAL}

//
// Stratify by excluding strings in non-numeric data columns.
//
//column_str_exc_name =
${METPLUS_COLUMN_STR_EXC_NAME}

//column_str_exc_val =
${METPLUS_COLUMN_STR_EXC_VAL}

//
// Similar to the column_thresh options above
//
${METPLUS_INIT_THRESH_NAME}
${METPLUS_INIT_THRESH_VAL}

//
// Similar to the column_str options above
//
${METPLUS_INIT_STR_NAME}
${METPLUS_INIT_STR_VAL}

//
// Similar to the column_str_exc options above
//
//init_str_exc_name =
${METPLUS_INIT_STR_EXC_NAME}

//init_str_exc_val =
${METPLUS_INIT_STR_EXC_VAL}

//
// Stratify by the ADECK and BDECK distances to land.
//
${METPLUS_WATER_ONLY}

//
// Specify whether only those track points occurring near landfall should be
// retained, and define the landfall retention window in HH[MMSS] format
// around the landfall time.
//
${METPLUS_LANDFALL}
${METPLUS_LANDFALL_BEG}
${METPLUS_LANDFALL_END}

//
// Specify whether only those track points common to both the ADECK and BDECK
// tracks should be retained.  May modify using the "-match_points" job command
// option.
//
${METPLUS_MATCH_POINTS}

//
// Array of TCStat analysis jobs to be performed on the filtered data
//
${METPLUS_JOBS}

${METPLUS_MET_CONFIG_OVERRIDES}

SeriesAnalysisConfig_wrapped

Note

See the SeriesAnalysis MET Configuration section of the User’s Guide for more information on the environment variables used in the file below:

////////////////////////////////////////////////////////////////////////////////
//
// Series-Analysis configuration file.
//
// For additional information, see the MET_BASE/config/README file.
//
////////////////////////////////////////////////////////////////////////////////

//
// Output model name to be written
//
${METPLUS_MODEL}

//
// Output description to be written
//
${METPLUS_DESC}

//
// Output observation type to be written
//
${METPLUS_OBTYPE}

////////////////////////////////////////////////////////////////////////////////

//
// Verification grid
// May be set separately in each "field" entry
//
${METPLUS_REGRID_DICT}

////////////////////////////////////////////////////////////////////////////////

censor_thresh = [];
censor_val    = [];
${METPLUS_CAT_THRESH}
cnt_thresh    = [ NA ];
cnt_logic     = UNION;

//
// Forecast and observation fields to be verified
//
fcst = {
   ${METPLUS_FCST_FILE_TYPE}
   ${METPLUS_FCST_FIELD}
}
obs = {
   ${METPLUS_OBS_FILE_TYPE}
   ${METPLUS_OBS_FIELD}
}

////////////////////////////////////////////////////////////////////////////////

//
// Climatology data
//
//climo_mean = {
${METPLUS_CLIMO_MEAN_DICT}


//climo_stdev = {
${METPLUS_CLIMO_STDEV_DICT}

////////////////////////////////////////////////////////////////////////////////

//
// Confidence interval settings
//
ci_alpha  = [ 0.05 ];

boot = {
   interval = PCTILE;
   rep_prop = 1.0;
   n_rep    = 0;
   rng      = "mt19937";
   seed     = "";
}

////////////////////////////////////////////////////////////////////////////////

//
// Verification masking regions
//
mask = {
   grid = "";
   poly = "";
}

//
// Number of grid points to be processed concurrently.  Set smaller to use
// less memory but increase the number of passes through the data.
//
${METPLUS_BLOCK_SIZE}

//
// Ratio of valid matched pairs to compute statistics for a grid point
//
${METPLUS_VLD_THRESH}

////////////////////////////////////////////////////////////////////////////////

//
// Statistical output types
//
output_stats = {
   fho    = [];
   ctc    = [];
   ${METPLUS_CTS_LIST}
   mctc   = [];
   mcts   = [];
   ${METPLUS_STAT_LIST}
   sl1l2  = [];
   sal1l2 = [];
   pct    = [];
   pstd   = [];
   pjc    = [];
   prc    = [];
}

////////////////////////////////////////////////////////////////////////////////

rank_corr_flag = FALSE;
tmp_dir        = "/tmp";
//version        = "V10.0";

////////////////////////////////////////////////////////////////////////////////

${METPLUS_MET_CONFIG_OVERRIDES}

Running METplus

This use case can be run two ways:

  1. Passing in TCStat_SeriesAnalysis_fcstGFS_obsGFS_FeatureRelative_SeriesByInit.conf then a user-specific system configuration file:

    run_metplus.py -c /path/to/METplus/parm/use_cases/model_applications/medium_range/TCStat_SeriesAnalysis_fcstGFS_obsGFS_FeatureRelative_SeriesByInit.conf -c /path/to/user_system.conf
    
  2. Modifying the configurations in parm/metplus_config, then passing in TCStat_SeriesAnalysis_fcstGFS_obsGFS_FeatureRelative_SeriesByInit.conf:

    run_metplus.py -c /path/to/METplus/parm/use_cases/model_applications/medium_range/TCStat_SeriesAnalysis_fcstGFS_obsGFS_FeatureRelative_SeriesByInit.conf
    

The former method is recommended. Whether you add them to a user-specific configuration file or modify the metplus_config files, the following variables must be set correctly:

  • INPUT_BASE - Path to directory where sample data tarballs are unpacked (See Datasets section to obtain tarballs). This is not required to run METplus, but it is required to run the examples in parm/use_cases

  • OUTPUT_BASE - Path where METplus output will be written. This must be in a location where you have write permissions

  • MET_INSTALL_DIR - Path to location where MET is installed locally

For the [exe] section, you will need to define the location of NON-MET executables. If the executable is in the user’s path, METplus will find it from the name. If the executable is not in the path, specify the full path to the executable here (e.g. CONVERT = /usr/bin/convert). The following executables are required for performing series analysis use cases:

If the executables are in the path:

  • CONVERT = convert

NOTE: All of these executable items must be located under the [exe] section.

If the executables are not in the path, they need to be defined:

  • CONVERT = /path/to/convert

Example User Configuration File:

[dir]
INPUT_BASE = /path/to/sample/input/data
OUTPUT_BASE = /path/to/output/dir
MET_INSTALL_DIR = /path/to/met-X.Y

[exe]
CONVERT = /path/to/convert

NOTE: INPUT_BASE, OUTPUT_BASE, and MET_INSTALL_DIR must be located under the [dir] section, while CONVERT (and any other non-MET executables) must be located under the [exe] section.

Expected Output

A successful run will output the following both to the screen and to the logfile:

INFO: METplus has successfully finished running.

Refer to the value set for OUTPUT_BASE to find where the output data was generated. Output for this use case will be found in series_analysis_init/20141214_00 (relative to OUTPUT_BASE) and will contain the following subdirectories:

  • ML1200942014

  • ML1201002014

  • ML1201032014

  • ML1201042014

  • ML1201052014

  • ML1201062014

  • ML1201072014

  • ML1201082014

  • ML1201092014

  • ML1201102014

Each subdirectory will contain files that have the following format:

ANLY_ASCII_FILES_<storm>

FCST_ASCII_FILES_<storm>

series_<varname>_<level>_<stat>.png

series_<varname>_<level>_<stat>.ps

series_<varname>_<level>_<stat>.nc
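
A quick, hedged way to inspect these results from the command line (paths assume the example user configuration above; take the exact series_*.nc filenames from the directory listing):

    ls /path/to/output/dir/series_analysis_init/20141214_00/ML1200942014
    # then, for any of the series_*.nc files listed, e.g.:
    ncdump -h /path/to/output/dir/series_analysis_init/20141214_00/ML1200942014/series_TMP_Z2.nc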

Keywords
