5.2.9.1. Blocking Calculation: RegridDataPlane, PcpCombine, and Blocking Python code

model_applications/s2s/UserScript_obsERA_obsOnly_Blocking.py

Scientific Objective

To compute the Central Blocking Latitude (CBL), Instantaneously Blocked Latitudes (IBLs), Group Instantaneously Blocked Latitudes (GIBLs), and the frequency of atmospheric blocking using the Pelly-Hoskins method.
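
For reference, the blocking index implemented in run_mod_blocking1d (in Blocking.py, shown below) follows the Pelly-Hoskins approach applied to 500 mb height: at each longitude the height is averaged over a band poleward of the CBL and over a band equatorward of it, and the longitude is Instantaneously Blocked when the poleward mean exceeds the equatorward mean. A sketch in LaTeX notation, where \phi_0 is the CBL plus one of the offsets \delta in LAT_DELTA and \Delta\phi is the NORTH_SOUTH_LIMITS extent:

B(\lambda,\delta) = \overline{Z}_N(\lambda,\delta) - \overline{Z}_S(\lambda,\delta)

\overline{Z}_N = \mathrm{mean}\,[Z_{500}]\ \text{over}\ [\phi_0,\ \phi_0+\Delta\phi/2], \qquad \overline{Z}_S = \mathrm{mean}\,[Z_{500}]\ \text{over}\ [\phi_0-\Delta\phi/2,\ \phi_0]

A longitude \lambda is flagged as an IBL when B(\lambda,\delta) > 0 for any offset \delta.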

Datasets

  • Forecast dataset: None

  • Observation dataset: ERA Reanalysis 500 mb height.

External Dependencies

You will need to use a version of Python 3.6+ that has the following packages installed:

* numpy
* netCDF4
* datetime
* bisect
* scipy

If the version of Python used to compile MET did not have these libraries at the time of compilation, you will need to add these packages or create a new Python environment with these packages.

If this is the case, you will need to set the MET_PYTHON_EXE environment variable to the path of the version of Python you want to use. If you want this version of Python to only apply to this use case, set it in the [user_env_vars] section of a METplus configuration file:

[user_env_vars]
MET_PYTHON_EXE = /path/to/python/with/required/packages/bin/python

METplus Components

This use case runs the blocking driver script, which runs the steps the user lists in OBS_STEPS. The possible steps are regridding, time averaging, computing a running mean, computing anomalies, computing CBLs (CBL), plotting CBLs (PLOTCBL), computing IBLs (IBL), plotting IBL frequency (PLOTIBL), computing GIBLs (GIBL), computing blocks (CALCBLOCKS), and plotting the blocking frequency (PLOTBLOCKS). Regridding, time averaging, running means, and anomalies are set up in the UserScript .conf file and are formatted as follows: PROCESS_LIST = RegridDataPlane(regrid_obs), PcpCombine(daily_mean_obs), PcpCombine(running_mean_obs), PcpCombine(anomaly_obs), UserScript(script_blocking)

The other steps are listed in the Blocking .conf file and are formatted as follows: OBS_STEPS = CBL+PLOTCBL+IBL+PLOTIBL+GIBL+CALCBLOCKS+PLOTBLOCKS
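
As a minimal sketch of how these "+"-separated step lists are interpreted (mirroring the parse_steps helper in Blocking_WeatherRegime_util.py shown later), the value is simply split on the "+" character:

# Illustration only: split an OBS_STEPS-style string into individual step names
obs_steps = "CBL+PLOTCBL+IBL+PLOTIBL+GIBL+CALCBLOCKS+PLOTBLOCKS"
steps_list_obs = obs_steps.split("+")
# steps_list_obs -> ['CBL', 'PLOTCBL', 'IBL', 'PLOTIBL', 'GIBL', 'CALCBLOCKS', 'PLOTBLOCKS']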

METplus Workflow

The blocking Python code is run for each time for the forecast and observation data. This example loops by valid time. This version is set to only process the blocking steps (CBL, PLOTCBL, IBL, PLOTIBL, GIBL, CALCBLOCKS, PLOTBLOCKS), omitting the regridding, time averaging, running mean, and anomaly pre-processing steps. However, the configurations for pre-processing are available for user reference.

METplus Configuration

METplus first loads all of the configuration files found in parm/metplus_config, then it loads any configuration files passed to METplus via the command line, i.e. parm/use_cases/model_applications/s2s/UserScript_obsERA_obsOnly_Blocking.py. The file UserScript_obsERA_obsOnly_Blocking.conf runs the Python program, while UserScript_obsERA_obsOnly_Blocking/Regrid_PCP_obsERA_obsOnly_Blocking.conf sets the variables for all steps of the Blocking use case.

# UserScript wrapper example

[config]
# All steps, including pre-processing:
# PROCESS_LIST = RegridDataPlane(regrid_obs), PcpCombine(daily_mean_obs), PcpCombine(running_mean_obs), PcpCombine(anomaly_obs), UserScript(script_blocking)
# Only Blocking Analysis script for the observations
PROCESS_LIST = UserScript(script_blocking)

# time looping - options are INIT, VALID, RETRO, and REALTIME
# If set to INIT or RETRO:
#   INIT_TIME_FMT, INIT_BEG, INIT_END, and INIT_INCREMENT must also be set
# If set to VALID or REALTIME:
#   VALID_TIME_FMT, VALID_BEG, VALID_END, and VALID_INCREMENT must also be set
LOOP_BY = VALID

# Format of VALID_BEG and VALID_END using % items
# %Y = 4 digit year, %m = 2 digit month, %d = 2 digit day, etc.
# see www.strftime.org for more information
# %Y%m%d%H expands to YYYYMMDDHH
VALID_TIME_FMT = %Y%m%d%H

# Start time for METplus run
VALID_BEG = 1979120100

# End time for METplus run
VALID_END = 2017022800

# Increment between METplus runs in seconds. Must be >= 60
VALID_INCREMENT = 86400

# List of forecast leads to process for each run time (init or valid)
# In hours if units are not specified
# If unset, defaults to 0 (don't loop through forecast leads)
LEAD_SEQ = 0

# Only Process DJF
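# SKIP_TIMES below uses %m:begin_end_incr(3,11,1), which expands to months 03
# through 11, so only December, January, and February are processed;
# %m%d:0229 additionally skips leap days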
SKIP_TIMES = "%m:begin_end_incr(3,11,1)", "%m%d:0229"

# Order of loops to process data - Options are times, processes
# Not relevant if only one item is in the PROCESS_LIST
# times = run all wrappers in the PROCESS_LIST for a single run time, then
#   increment the run time and run all wrappers again until all times have
#   been evaluated.
# processes = run the first wrapper in the PROCESS_LIST for all times
#   specified, then repeat for the next item in the PROCESS_LIST until all
#   wrappers have been run
LOOP_ORDER = processes

# location of configuration files used by MET applications
CONFIG_DIR={PARM_BASE}/use_cases/model_applications/s2s

# Run the obs data
# A variable set to be used in the pre-processing steps
OBS_RUN = True


# Regrid the observations to 1 degree using regrid_data_plane
[regrid_obs]
# End time for METplus run
VALID_END = 2017022818

# Increment between METplus runs in seconds. Must be >= 60
VALID_INCREMENT = 21600

# Run regrid_data_plane on observation data
OBS_REGRID_DATA_PLANE_RUN = {OBS_RUN}

# If true, process each field individually and write a file for each
# If false, run once per run time passing in all fields specified
OBS_DATA_PLANE_ONCE_PER_FIELD = False

# Name of input field to process
OBS_REGRID_DATA_PLANE_VAR1_INPUT_FIELD_NAME = Z

# Level of input field to process
OBS_REGRID_DATA_PLANE_VAR1_INPUT_LEVEL = P500

# Name of output field to create
OBS_REGRID_DATA_PLANE_VAR1_OUTPUT_FIELD_NAME = Z500

# Mask to use for regridding
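# The latlon grid specification below is: Nx Ny lat_ll lon_ll delta_lat delta_lon,
# i.e. a 1-degree, 360 x 90 point Northern Hemisphere grid running from 89N
# toward the equator (delta_lat is negative); see the MET User's Guide for the
# full grid specification syntax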
REGRID_DATA_PLANE_VERIF_GRID = latlon 360 90 89 0 -1.0 1.0

# Method to run regrid_data_plane, not setting this will default to NEAREST
REGRID_DATA_PLANE_METHOD = BILIN

# Regridding width used in regrid_data_plane, not setting this will default to 1
REGRID_DATA_PLANE_WIDTH = 2

# input and output data directories for each application in PROCESS_LIST
OBS_REGRID_DATA_PLANE_INPUT_DIR = /gpfs/fs1/collections/rda/data/ds627.0/ei.oper.an.pl
OBS_REGRID_DATA_PLANE_OUTPUT_DIR = {OUTPUT_BASE}/s2s/UserScript_fcstGFS_obsERA_Blocking/ERA/Regrid

# format of filenames
# Input ERA Interim
OBS_REGRID_DATA_PLANE_INPUT_TEMPLATE = {valid?fmt=%Y%m}/ei.oper.an.pl.regn128sc.{valid?fmt=%Y%m%d%H}
OBS_REGRID_DATA_PLANE_OUTPUT_TEMPLATE = {valid?fmt=%Y%m%d}/Z500_6hourly_{init?fmt=%Y%m%d%H}_NH.nc


# Perform a sum over the 4 daily times that have been regridded using pcp_combine
# 00, 06, 12, 18 UTC
[daily_mean_obs]
# Start time for METplus run
VALID_BEG = 1979120118

# End time for METplus run
VALID_END = 2017022818

# run pcp_combine on obs data
OBS_PCP_COMBINE_RUN = {OBS_RUN}

# method to run pcp_combine on observation data
# Options are ADD, SUM, SUBTRACT, DERIVE, and USER_DEFINED
OBS_PCP_COMBINE_METHOD = DERIVE
OBS_PCP_COMBINE_STAT_LIST = MEAN

# field name and level of the 6-hourly input observation files
OBS_PCP_COMBINE_INPUT_ACCUMS = 6
OBS_PCP_COMBINE_INPUT_NAMES = Z500
OBS_PCP_COMBINE_INPUT_LEVELS = "(*,*)"
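# In the options below, convert(x) divides the ERA geopotential (m^2/s^2) by
# 9.81 to obtain geopotential height in meters, and set_attr_valid shifts the
# output valid time back 64800 s (18 h) so the 00-18 UTC daily mean is labeled
# at 00 UTC of the same day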
OBS_PCP_COMBINE_INPUT_OPTIONS = convert(x) = x / 9.81; set_attr_valid = "{valid?fmt=%Y%m%d_%H%M%S?shift=-64800}";

# Convert output and set 24 hours as the accumulation
OBS_PCP_COMBINE_OUTPUT_ACCUM = 24
OBS_PCP_COMBINE_DERIVE_LOOKBACK = 24

# Name output variable Z500
OBS_PCP_COMBINE_OUTPUT_NAME = Z500

# Input and output Data directories for each application in PROCESS_LIST
OBS_PCP_COMBINE_INPUT_DIR = {OUTPUT_BASE}/s2s/UserScript_fcstGFS_obsERA_Blocking/ERA/Regrid
OBS_PCP_COMBINE_OUTPUT_DIR = {OUTPUT_BASE}/s2s/UserScript_fcstGFS_obsERA_Blocking/ERA/Daily

# Input and Output filename templates, ERA Interim
OBS_PCP_COMBINE_INPUT_TEMPLATE = {valid?fmt=%Y%m%d}/Z500_6hourly_{valid?fmt=%Y%m%d%H}_NH.nc
OBS_PCP_COMBINE_OUTPUT_TEMPLATE = Z500_daily_{valid?fmt=%Y%m%d?shift=-64800}_NH.nc


# Perform a 5 day running mean on the data using pcp_combine
[running_mean_obs]
# Skip the first 4 days of December (and leap days); the 5-day running mean
# needs daily files from the 4 preceding days
SKIP_TIMES = "%m:begin_end_incr(3,11,1)", "%m%d:1201,1202,1203,1204,0229"

# run pcp_combine on obs data
OBS_PCP_COMBINE_RUN = {OBS_RUN}

# method to run pcp_combine on observation data
# Options are ADD, SUM, SUBTRACT, DERIVE, and USER_DEFINED
OBS_PCP_COMBINE_METHOD = DERIVE
OBS_PCP_COMBINE_STAT_LIST = MEAN

# field name, level, and valid time attribute of the daily mean input files
OBS_PCP_COMBINE_INPUT_ACCUMS = 24
OBS_PCP_COMBINE_INPUT_NAMES = Z500
OBS_PCP_COMBINE_INPUT_LEVELS = "(*,*)"
OBS_PCP_COMBINE_INPUT_OPTIONS = set_attr_valid = "{valid?fmt=%Y%m%d_%H%M%S?shift=-172800}";

#  Running mean is 5 days
OBS_PCP_COMBINE_OUTPUT_ACCUM = 120
OBS_PCP_COMBINE_DERIVE_LOOKBACK = 120
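# A 120 h lookback over 24 h daily means uses 5 input files per output; the
# 172800 s (2 day) valid-time shift above and in the output template labels
# the running mean at the center day of the 5-day window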

# Set output variable name
OBS_PCP_COMBINE_OUTPUT_NAME = Z500

# input and output data directories for each application in PROCESS_LIST
OBS_PCP_COMBINE_INPUT_DIR = {OUTPUT_BASE}/s2s/UserScript_fcstGFS_obsERA_Blocking/ERA/Daily
OBS_PCP_COMBINE_OUTPUT_DIR = {OUTPUT_BASE}/s2s/UserScript_fcstGFS_obsERA_Blocking/ERA/Rmean5d

# format of filenames
# Input ERA Interim
OBS_PCP_COMBINE_INPUT_TEMPLATE = Z500_daily_{valid?fmt=%Y%m%d}_NH.nc
OBS_PCP_COMBINE_OUTPUT_TEMPLATE = Z500_5daymean_{valid?fmt=%Y%m%d?shift=-172800}_NH.nc


# Compute anomalies using the daily means and 5 day running mean using pcp_combine
[anomaly_obs]
# Skip the first/last 2 days of the season; the centered 5-day running mean
# is not available for them
SKIP_TIMES = "%m:begin_end_incr(3,11,1)", "%m%d:1201,1202,0227,0228,0229"

# run pcp_combine on obs data
OBS_PCP_COMBINE_RUN = {OBS_RUN}

# method to run pcp_combine on observation data
# Options are ADD, SUM, SUBTRACT, DERIVE, and USER_DEFINED
OBS_PCP_COMBINE_METHOD = USER_DEFINED

# User defined pcp_combine command
OBS_PCP_COMBINE_COMMAND = -subtract {OBS_PCP_COMBINE_INPUT_DIR}/Daily/Z500_daily_{valid?fmt=%Y%m%d}_NH.nc {OBS_PCP_COMBINE_INPUT_DIR}/Rmean5d/Z500_5daymean_{valid?fmt=%Y%m%d}_NH.nc -field 'name="Z500"; level="(*,*)";'

# input and output data directories for each application in PROCESS_LIST
OBS_PCP_COMBINE_INPUT_DIR = {OUTPUT_BASE}/s2s/UserScript_fcstGFS_obsERA_Blocking/ERA
OBS_PCP_COMBINE_OUTPUT_DIR = {OUTPUT_BASE}/s2s/UserScript_fcstGFS_obsERA_Blocking/ERA/Anomaly

# format of filenames
# Input ERA Interim
OBS_PCP_COMBINE_INPUT_TEMPLATE = Z500_daily_{valid?fmt=%Y%m%d}_NH.nc
OBS_PCP_COMBINE_OUTPUT_TEMPLATE = Z500_anomaly_{valid?fmt=%Y%m%d}_NH.nc


# Run the Blocking Analysis Script
[script_blocking]
# list of strings to loop over for each run time.
# value for each item can be referenced in filename templates with {custom?fmt=%s}
USER_SCRIPT_CUSTOM_LOOP_LIST = nc

# Run the user script once
USER_SCRIPT_RUNTIME_FREQ = RUN_ONCE

# Command to run the user script with input configuration file
USER_SCRIPT_COMMAND = {METPLUS_BASE}/parm/use_cases/model_applications/s2s/UserScript_obsERA_obsOnly_Blocking/Blocking_driver.py {METPLUS_BASE}/parm/use_cases/model_applications/s2s/UserScript_obsERA_obsOnly_Blocking/Blocking_obsERA_obsOnly.conf dir.INPUT_BASE={INPUT_BASE} dir.OUTPUT_BASE={OUTPUT_BASE} dir.MET_INSTALL_DIR={MET_INSTALL_DIR}

MET Configuration

METplus sets environment variables based on the values in the METplus configuration file. These variables are referenced in the MET configuration file. YOU SHOULD NOT SET ANY OF THESE ENVIRONMENT VARIABLES YOURSELF! THEY WILL BE OVERWRITTEN BY METPLUS WHEN IT CALLS THE MET TOOLS! If there is a setting in the MET configuration file that is not controlled by an environment variable, you can add additional environment variables to be set only within the METplus environment using the [user_env_vars] section of the METplus configuration files. See the ‘User Defined Config’ section on the ‘System Configuration’ page of the METplus User’s Guide for more information.

See the following files for more information about the environment variables set in this configuration file.

parm/use_cases/met_tool_wrapper/RegridDataPlane/RegridDataPlane.py
parm/use_cases/met_tool_wrapper/PCPCombine/PCPCombine_derive.py
parm/use_cases/met_tool_wrapper/PCPCombine/PCPCombine_subtract.py

Python Scripts

This use case uses Python scripts to perform the blocking calculation.

parm/use_cases/model_applications/s2s/UserScript_obsERA_obsOnly_Blocking/Blocking_driver.py: This script calls the requested steps in the blocking analysis for a forecast, observation, or both. The possible steps are computing CBLs, plotting CBLs, computing IBLs, plotting IBLs, computing GIBLs, computing blocks, and plotting blocks.

parm/use_cases/model_applications/s2s/UserScript_obsERA_obsOnly_Blocking/Blocking.py: This script runs the requested steps, containing the code for computing CBLs, computing IBLs, computing GIBLs, and computing blocks.

parm/use_cases/model_applications/s2s/UserScript_obsERA_obsOnly_Blocking/Blocking_WeatherRegime_util.py: This script contains functions used by both the blocking and weather regime analysis, including the code for determining which steps the user wants to run, and for finding and reading the input files in the format output by the pre-processing steps.

#!/usr/bin/env python3
import sys
import os
import numpy as np
import netCDF4

from Blocking import BlockingCalculation
from metplus.util import pre_run_setup, config_metplus, get_start_end_interval_times, get_lead_sequence
from metplus.util import get_skip_times, skip_time, is_loop_by_init, ti_calculate, do_string_sub
from metplotpy.contributed.blocking_s2s import plot_blocking as pb
from metplotpy.contributed.blocking_s2s.CBL_plot import create_cbl_plot
from Blocking_WeatherRegime_util import find_input_files, parse_steps

def main():

    all_steps = ["CBL","PLOTCBL","IBL","PLOTIBL","GIBL","CALCBLOCKS","PLOTBLOCKS"]

    inconfig_list = sys.argv[1:]
    steps_list_fcst,steps_list_obs,config_list = parse_steps(inconfig_list)
    config = pre_run_setup(config_list)

    if not steps_list_obs and not steps_list_fcst:
        print('No processing steps requested for either the model or observations,')
        print('no data will be processed')


    ######################################################################
    # Blocking Calculation and Plotting
    ######################################################################
    # Set up the data
    steps_fcst = BlockingCalculation(config,'FCST')
    steps_obs = BlockingCalculation(config,'OBS')

    # Check to see if there is a plot directory
    oplot_dir = config.getstr('Blocking','BLOCKING_PLOT_OUTPUT_DIR','')
    if not oplot_dir:
        obase = config.getstr('config','OUTPUT_BASE')
        oplot_dir = obase+'/'+'plots'
    if not os.path.exists(oplot_dir):
        os.makedirs(oplot_dir)

    # Check to see if CBL's are used from an obs climatology
    use_cbl_obs = config.getbool('Blocking','USE_CBL_OBS',False)

    # Calculate Central Blocking Latitude
    cbl_config = config_metplus.replace_config_from_section(config,'Blocking')
    cbl_config_init = config.find_section('Blocking','CBL_INIT_BEG')
    cbl_config_valid = config.find_section('Blocking','CBL_VALID_BEG')
    use_init =  is_loop_by_init(cbl_config)
    if use_init:
        orig_beg = config.getstr('Blocking','INIT_BEG')
        orig_end = config.getstr('Blocking','INIT_END')
        if cbl_config_init is not None:
            config.set('Blocking','INIT_BEG',config.getstr('Blocking','CBL_INIT_BEG'))
            config.set('Blocking','INIT_END',config.getstr('Blocking','CBL_INIT_END'))
            cbl_config = config_metplus.replace_config_from_section(config, 'Blocking')
    else:
        orig_beg = config.getstr('Blocking','VALID_BEG')
        orig_end = config.getstr('Blocking','VALID_END')
        if cbl_config_valid is not None:
            config.set('Blocking','VALID_BEG',config.getstr('Blocking','CBL_VALID_BEG'))
            config.set('Blocking','VALID_END',config.getstr('Blocking','CBL_VALID_END'))
            cbl_config = config_metplus.replace_config_from_section(config, 'Blocking')

    if ("CBL" in steps_list_obs):
        print('Computing Obs CBLs')
        obs_infiles,yr_obs,mth_obs,day_obs,yr_full_obs = find_input_files(cbl_config, use_init, 
            'OBS_BLOCKING_ANOMALY_TEMPLATE')
        cbls_obs,lats_obs,lons_obs,yr_obs,mhweight_obs = steps_obs.run_CBL(obs_infiles,yr_obs)

    if ("CBL" in steps_list_fcst) and not use_cbl_obs:
        # Add in step to use obs for CBLS
        print('Computing Forecast CBLs')
        fcst_infiles,yr_fcst,mth_fcst,day_fcst,yr_full_fcst = find_input_files(cbl_config, use_init, 
            'FCST_BLOCKING_ANOMALY_TEMPLATE')
        cbls_fcst,lats_fcst,lons_fcst,yr_fcst,mhweight_fcst = steps_fcst.run_CBL(fcst_infiles,yr_fcst)
    elif ("CBL" in steps_list_fcst) and use_cbl_obs:
        if not ("CBL" in steps_list_obs):
            raise Exception('Must run observed CBLs before using them as a forecast.')
        cbls_fcst = cbls_obs
        lats_fcst = lats_obs
        lons_fcst = lons_obs
        yr_fcst = yr_obs
        mhweight_fcst = mhweight_obs

    #Plot Central Blocking Latitude
    if ("PLOTCBL" in steps_list_obs):
        if not ("CBL" in steps_list_obs):
            raise Exception('Must run observed CBLs before plotting them.')
        print('Plotting Obs CBLs')
        cbl_plot_mthstr = config.getstr('Blocking','OBS_CBL_PLOT_MTHSTR')
        cbl_plot_outname = oplot_dir+'/'+config.getstr('Blocking','OBS_CBL_PLOT_OUTPUT_NAME')
        create_cbl_plot(lons_obs, lats_obs, cbls_obs, mhweight_obs, cbl_plot_mthstr, cbl_plot_outname, 
            do_averaging=True)
    if ("PLOTCBL" in steps_list_fcst):
        if not ("CBL" in steps_list_fcst):
            raise Exception('Must run forecast CBLs before plotting them.')
        print('Plotting Forecast CBLs')
        cbl_plot_mthstr = config.getstr('Blocking','FCST_CBL_PLOT_MTHSTR')
        cbl_plot_outname = oplot_dir+'/'+config.getstr('Blocking','FCST_CBL_PLOT_OUTPUT_NAME')
        create_cbl_plot(lons_fcst, lats_fcst, cbls_fcst, mhweight_fcst, cbl_plot_mthstr, cbl_plot_outname, 
            do_averaging=True)


    # Run IBL
    if use_init:
       config.set('Blocking','INIT_BEG',orig_beg)
       config.set('Blocking','INIT_END',orig_end)
    else:
        config.set('Blocking','VALID_BEG',orig_beg)
        config.set('Blocking','VALID_END',orig_end)
    ibl_config = config_metplus.replace_config_from_section(config,'Blocking')
    if ("IBL" in steps_list_obs) and not ("IBL" in steps_list_fcst):
        if not ("CBL" in steps_list_obs):
            raise Exception('Must run observed CBLs before running IBLs.')
        print('Computing Obs IBLs')
        obs_infiles,yr_obs,mth_obs,day_obs,yr_full_obs = find_input_files(ibl_config, use_init, 'OBS_BLOCKING_TEMPLATE')
        ibls_obs = steps_obs.run_Calc_IBL(cbls_obs,obs_infiles,yr_obs)
        daynum_obs = np.arange(0,len(ibls_obs[0,:,0]),1)
    elif ("IBL" in steps_list_fcst) and not ("IBL" in steps_list_obs):
        if (not "CBL" in steps_list_fcst):
            raise Exception('Must run forecast CBLs or use observed CBLs before running IBLs.')
        print('Computing Forecast IBLs')
        fcst_infiles,yr_fcst,mth_fcst,day_fcst,yr_full_fcst = find_input_files(ibl_config, use_init, 
            'FCST_BLOCKING_TEMPLATE')
        ibls_fcst = steps_fcst.run_Calc_IBL(cbls_fcst,fcst_infiles,yr_fcst)
        daynum_fcst = np.arange(0,len(ibls_fcst[0,:,0]),1)
    elif ("IBL" in steps_list_obs) and ("IBL" in steps_list_fcst):
        if not ("CBL" in steps_list_obs) and not ("CBL" in steps_list_fcst):
            raise Exception('Must run observed and forecast CBLs before running IBLs.')
        both_infiles,yr_obs,mth_obs,day_obs,yr_full_obs = find_input_files(ibl_config, use_init, 'OBS_BLOCKING_TEMPLATE',
            secondtemplate='FCST_BLOCKING_TEMPLATE')
        obs_infiles = both_infiles[0]
        fcst_infiles = both_infiles[1]
        yr_fcst = yr_obs
        mth_fcst = mth_obs
        day_fcst = day_obs
        print('Computing Obs IBLs')
        ibls_obs = steps_obs.run_Calc_IBL(cbls_obs,obs_infiles,yr_obs)
        daynum_obs = np.arange(0,len(ibls_obs[0,:,0]),1)
        print('Computing Forecast IBLs')
        ibls_fcst = steps_fcst.run_Calc_IBL(cbls_fcst,fcst_infiles,yr_fcst)
        daynum_fcst = np.arange(0,len(ibls_fcst[0,:,0]),1)

    # Plot IBLS
    if("PLOTIBL" in steps_list_obs) and not ("PLOTIBL" in steps_list_fcst):
        if not ("IBL" in steps_list_obs):
            raise Exception('Must run observed IBLs before plotting them.')
        print('Plotting Obs IBLs')
        ibl_plot_title = config.getstr('Blocking','OBS_IBL_PLOT_TITLE','IBL Frequency')
        ibl_plot_outname = oplot_dir+'/'+config.getstr('Blocking','OBS_IBL_PLOT_OUTPUT_NAME','')
        ibl_plot_label1 = config.getstr('Blocking','IBL_PLOT_OBS_LABEL','')
        pb.plot_ibls(ibls_obs,lons_obs,ibl_plot_title,ibl_plot_outname,label1=ibl_plot_label1)
    elif ("PLOTIBL" in steps_list_fcst) and not ("PLOTIBL" in steps_list_obs):
        if not ("IBL" in steps_list_fcst):
            raise Exception('Must run forecast IBLs before plotting them.')
        print('Plotting Forecast IBLs')
        ibl_plot_title = config.getstr('Blocking','FCST_IBL_PLOT_TITLE','IBL Frequency')
        ibl_plot_outname = oplot_dir+'/'+config.getstr('Blocking','FCST_IBL_PLOT_OUTPUT_NAME')
        ibl_plot_label1 = config.getstr('Blocking','IBL_PLOT_FCST_LABEL',None)
        pb.plot_ibls(ibls_fcst,lons_fcst,ibl_plot_title,ibl_plot_outname,label1=ibl_plot_label1)
    elif ("PLOTIBL" in steps_list_obs) and ("PLOTIBL" in steps_list_fcst):
        if (not "IBL" in steps_list_obs) and (not "IBL" in steps_list_fcst):
            raise Exception('Must run forecast and observed IBLs before plotting them.')
        print('Plotting Obs and Forecast IBLs')
        ibl_plot_title = config.getstr('Blocking','IBL_PLOT_TITLE')
        ibl_plot_outname = oplot_dir+'/'+config.getstr('Blocking','IBL_PLOT_OUTPUT_NAME')
        #Check to see if there are plot legend labels
        ibl_plot_label1 = config.getstr('Blocking','IBL_PLOT_OBS_LABEL','Observation')
        ibl_plot_label2 = config.getstr('Blocking','IBL_PLOT_FCST_LABEL','Forecast')
        pb.plot_ibls(ibls_obs,lons_obs,ibl_plot_title,ibl_plot_outname,data2=ibls_fcst,lon2=lons_fcst,
            label1=ibl_plot_label1,label2=ibl_plot_label2)


    # Run GIBL
    if ("GIBL" in steps_list_obs):
        if not ("IBL" in steps_list_obs):
            raise Exception('Must run observed IBLs before running GIBLs.')
        print('Computing Obs GIBLs')
        gibls_obs = steps_obs.run_Calc_GIBL(ibls_obs,lons_obs)

    if ("GIBL" in steps_list_fcst):
        if not ("IBL" in steps_list_fcst):
            raise Exception('Must run Forecast IBLs before running GIBLs.')
        print('Computing Forecast GIBLs')
        gibls_fcst = steps_fcst.run_Calc_GIBL(ibls_fcst,lons_fcst)


    # Calc Blocks
    if ("CALCBLOCKS" in steps_list_obs):
        if not ("GIBL" in steps_list_obs):
            raise Exception('Must run observed GIBLs before calculating blocks.')
        print('Computing Blocks')
        block_freq_obs = steps_obs.run_Calc_Blocks(ibls_obs,gibls_obs,lons_obs,daynum_obs,yr_obs)

    if ("CALCBLOCKS" in steps_list_fcst):
        if not ("GIBL" in steps_list_fcst):
            raise Exception('Must run Forecast GIBLs before calculating blocks.')
        print('Computing Blocks')
        block_freq_fcst = steps_fcst.run_Calc_Blocks(ibls_fcst,gibls_fcst,lons_fcst,daynum_fcst,yr_fcst)

    # Plot Blocking Frequency
    if ("PLOTBLOCKS" in steps_list_obs):
        if not ("CALCBLOCKS" in steps_list_obs):
            raise Exception('Must compute observed blocks before plotting them.')
        print('Plotting Obs Blocks')
        blocking_plot_title = config.getstr('Blocking','OBS_BLOCKING_PLOT_TITLE')
        blocking_plot_outname = oplot_dir+'/'+config.getstr('Blocking','OBS_BLOCKING_PLOT_OUTPUT_NAME')
        pb.plot_blocks(block_freq_obs,gibls_obs,ibls_obs,lons_obs,blocking_plot_title,blocking_plot_outname)
    if ("PLOTBLOCKS" in steps_list_fcst):
        if not ("CALCBLOCKS" in steps_list_fcst):
            raise Exception('Must compute forecast blocks before plotting them.')
        print('Plotting Forecast Blocks')
        blocking_plot_title = config.getstr('Blocking','FCST_BLOCKING_PLOT_TITLE')
        blocking_plot_outname = oplot_dir+'/'+config.getstr('Blocking','FCST_BLOCKING_PLOT_OUTPUT_NAME')
        pb.plot_blocks(block_freq_fcst,gibls_fcst,ibls_fcst,lons_fcst,blocking_plot_title,blocking_plot_outname)


if __name__ == "__main__":
    main()
import os
import numpy as np
import datetime
import bisect
from scipy import stats
from scipy.signal import argrelextrema
from metplus.util import config_metplus, get_start_end_interval_times, get_lead_sequence
from metplus.util import get_skip_times, skip_time, is_loop_by_init, ti_calculate
from Blocking_WeatherRegime_util import read_nc_met

class BlockingCalculation():
    """Contains the programs to calculate Blocking via the Pelly-Hoskins Method
    """
    def __init__(self,config,label):

        self.blocking_anomaly_var = config.getstr('Blocking',label+'_BLOCKING_ANOMALY_VAR','')
        self.blocking_var = config.getstr('Blocking',label+'_BLOCKING_VAR','')
        self.smoothing_pts = config.getint('Blocking',label+'_SMOOTHING_PTS',9)
        lat_delta_in = config.getstr('Blocking',label+'_LAT_DELTA','-5,0,5')
        self.lat_delta = list(map(int,lat_delta_in.split(",")))
        self.n_s_limits = config.getint('Blocking',label+'_NORTH_SOUTH_LIMITS',30)
        self.ibl_dist = config.getint('Blocking',label+'_IBL_DIST',7)
        self.ibl_in_gibl = config.getint('Blocking',label+'_IBL_IN_GIBL',15)
        self.gibl_overlap = config.getint('Blocking',label+'_GIBL_OVERLAP',10)
        self.block_time = config.getint('Blocking',label+'_BLOCK_TIME',5)  ### Should fix so it supports other times
        self.block_travel = config.getint('Blocking',label+'_BLOCK_TRAVEL',45)
        self.block_method = config.getstr('Blocking',label+'_BLOCK_METHOD','PH')

        # Check data requirements
        if self.smoothing_pts % 2 == 0:
            print('ERROR: Smoothing Radius must be an odd number given in grid points, Exiting...')
            exit()


    def run_CBL(self,cblinfiles,cblyrs):

        z500_anom_4d,lats,lons,yr = read_nc_met(cblinfiles,cblyrs,self.blocking_anomaly_var)

        #Create latitude weighting (sqrt of cos(latitude)) for the NH
        cos = lats
        cos = cos * np.pi / 180.0
        way = np.cos(cos)
        way = np.sqrt(way)
        weightf = np.repeat(way[:,np.newaxis],360,axis=1)

        ####Find latitude of maximum high-pass STD (CBL)
        mstd = np.nanstd(z500_anom_4d,axis=1)
        mhweight = mstd * weightf
        cbli = np.argmax(mhweight,axis=1)
        CBL = np.zeros((len(z500_anom_4d[:,0,0,0]),len(lons)))
        for j in np.arange(0,len(yr),1):
            CBL[j,:] = lats[cbli[j,:]]

        ###Apply Moving Average to Smooth CBL Profiles
        lt = len(lons)
        CBLf = np.zeros((len(z500_anom_4d[:,0,0,0]),len(lons)))
        m=int((self.smoothing_pts-1)/2.0)
        for i in np.arange(0,len(CBL[0,:]),1):
            ma_indices = np.arange(i-m,i+m+1)
            ma_indices = np.where(ma_indices >= lt,ma_indices-lt,ma_indices)
            CBLf[:,i] = np.nanmean(CBL[:,ma_indices],axis=1).astype(int)
        
        return CBLf,lats,lons,yr,mhweight


    def run_mod_blocking1d(self,a,cbl,lat,lon,meth):
        lat_d = self.lat_delta
        dc = (90 - cbl).astype(int)
        db = self.n_s_limits
        BI = np.zeros((len(a[:,0,0]),len(lon)))
        blon = np.zeros((len(a[:,0,0]),len(lon)))
        if meth=='PH':
            # loop through days
            for k in np.arange(0,len(a[:,0,0]),1):
                blontemp=0
                q=0
                BI1=np.zeros((len(lat_d),len(lon)))
                for l in lat_d:
                    blon1 = np.zeros(len(lon))
                    d0 = dc-l
                    dn = ((dc - 1*db/2) - l).astype(np.int64)
                    ds = ((dc + 1*db/2) - l).astype(np.int64)
                    GHGS = np.zeros(len(cbl))
                    GHGN = np.zeros(len(cbl))
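                    # Pelly-Hoskins style index: GHGN is the mean Z500 over the band
                    # poleward of the (offset) CBL and GHGS the mean over the band
                    # equatorward of it; GHGN - GHGS > 0 marks a reversed gradient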
                    for jj in np.arange(0,len(cbl),1):
                        GHGN[jj] = np.mean(a[k,dn[jj]:d0[jj]+1,jj])
                        GHGS[jj] = np.mean(a[k,d0[jj]:ds[jj]+1,jj])
                    BI1[q,:] = GHGN-GHGS
                    q = q +1
                BI1 = np.max(BI1,axis=0)
                block = np.where((BI1>0))[0]
                blon1[block]=1
                blontemp = blontemp + blon1
                BI[k,:] = BI1
                blon[k,:] = blontemp

        return blon,BI


    def run_Calc_IBL(self,cbl,iblinfiles,iblyr):

        z500_daily,lats,lons,yr = read_nc_met(iblinfiles,iblyr,self.blocking_var)

        #Initialize arrays for IBLs and the blocking index
        # yr, day, lon
        blonlong = np.zeros((len(yr),len(z500_daily[0,:,0,0]),len(lons)))
        BI = np.zeros((len(yr),len(z500_daily[0,:,0,0]),len(lons)))

        #Using the long-term mean CBL and calling run_mod_blocking1d
        cbl = np.nanmean(cbl,axis=0)
        for i in np.arange(0,len(yr),1):
            blon,BI[i,:,:] = self.run_mod_blocking1d(z500_daily[i,:,:,:],cbl,lats,lons,self.block_method)
            blonlong[i,:,:] = blon

        return blonlong


    def run_Calc_GIBL(self,ibl,lons):

        #Initialize GIBL Array
        GIBL = np.zeros(np.shape(ibl))

        #####Loop finds IBLs within 7 degrees of each other, creating one group.
        ##### A GIBL exists if the group contains more than 15 IBLs
        crit = self.ibl_in_gibl

        for i in np.arange(0,len(GIBL[:,0,0]),1):
            for k in np.arange(0,len(GIBL[0,:,0]),1):
                gibli = np.zeros(len(GIBL[0,0,:]))
                thresh = crit/2.0
                a = ibl[i,k,:]
                db = self.ibl_dist
                ibli = np.where(a==1)[0]
                if len(ibli)==0:
                    continue
                idiff = ibli[1:] - ibli[:-1]
                bt=0
                btlon = ibli[0]
                ct = 1
                btfin = []
                block = ibli
                block = np.append(block,block+360)
                for ll in np.arange(1,len(block),1):
                    diff = np.abs(block[ll] - block[ll-1])
                    if diff == 1:
                        bt = [block[ll]]
                        btlon = np.append(btlon,bt)
                        ct = ct + diff
                    if diff <= thresh and diff != 1:
                        bt = np.arange(block[ll-1]+1,block[ll]+1,1)
                        btlon = np.append(btlon,bt)
                        ct = ct + diff
                    if diff > thresh or ll==(len(block)-1):
                        if ct >= crit:
                            btfin = np.append(btfin,btlon)
                            ct=1
                        ct = 1
                        btlon = block[ll]
                if len(btfin)/2 < crit :
                    btfin = []
                if len(btfin)==0:
                    continue
                gibl1 = btfin
                temp = np.where(gibl1>=360)[0]
                gibl1[temp] = gibl1[temp] - 360
                gibli[gibl1.astype(int)] = 1
                GIBL[i,k,:] = gibli

        return GIBL


    def Remove(self,duplicate):
        final_list = []
        for num in duplicate:
            if num not in final_list:
                final_list.append(num)
        return final_list


    def run_Calc_Blocks(self,ibl,GIBL,lon,tsin,year):

        crit = self.ibl_in_gibl

        ##Count up the blocked longitudes for each GIBL
        c = np.zeros((GIBL.shape))
        lonlen = len(lon)
        sz = []
        mx = []
        min = []

        for y in np.arange(0,len(GIBL[:,0,0]),1):
            for k in np.arange(0,len(GIBL[0,:,0]),1):
                a = GIBL[y,k] # Array of lons for each year,day
                ct=1
                ai = np.where(a==1)[0]
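                # Duplicate the blocked-longitude indices shifted by 360 so runs that
                # wrap across the 0/360 degree boundary are counted as a single run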
                ai = np.append(ai,ai+360)
                temp = np.where(ai>=360)[0]
                bi=list(ai)
                bi = np.array(bi)
                bi[temp] = bi[temp]-360
                # Loop through the lons that are part of the GIBL
                for i in np.arange(0,len(ai)-1,1):
                    diff = ai[i+1] - ai[i]
                    c[y,k,bi[i]] = ct
                    if diff==1:
                        ct=ct+1
                    else:
                        sz = np.append(sz,ct)
                        ct=1

        ########## - finding where the left and right limits of the block are - ################
        for i in np.arange(0,len(c[:,0,0]),1):
            for k in np.arange(0,len(c[0,:,0]),1):
                maxi = argrelextrema(c[i,k],np.greater,mode='wrap')[0]
                mini = np.where(c[i,k]==1)[0]
                if c[i,k,lonlen-1]!=0 and c[i,k,0]!=0:
                    mm1 = mini[-1]
                    mm2 = mini[:-1]
                    mini = np.append(mm1,mm2)
                mx = np.append(mx,maxi)
                min = np.append(min,mini)

        locy, locd, locl = np.where(c==crit)

        A = np.zeros(lonlen)
        A = np.expand_dims(A,axis=0)

        ################# - Splitting up each GIBL into its own array - ###################

        for i in np.arange(0,len(locy),1):
            m = locy[i]   #year
            n = locd[i]   #day
            o = locl[i]   #long where 15
            mm = int(mx[i])
            mn = min[i]
            temp1 = GIBL[m,n]
            temp2 = np.zeros(lonlen)
            if mn>mm:
                diff = int(mm - c[m,n,mm] + 1)
                lons = lon[diff]
                place1 = np.arange(lons,lonlen,1)
                place2 = np.arange(0,mm+1,1)
                bl = np.append(place2,place1).astype(int)
            if temp1[lonlen-1] ==1 and mm>200:
                lons = lon[mm]
                beg = mm - c[m,n,mm] + 1
                bl = np.arange(beg,mm+1,1).astype(int)
            if mm>mn: #temp1[359] ==0:
                lons = lon[mm]
                beg = mm - c[m,n,mm] + 1
                bl = np.arange(beg,mm+1,1).astype(int)
            temp2[bl] = 1
            temp2 = np.expand_dims(temp2,axis=0)
            A = np.concatenate((A,temp2),axis=0)

        A = A[1:]

        ######### - Getting rid of non-consecutive time steps which would prevent blocking - #################
        dd=[]
        dy = []
        dA = A[0]
        dA = np.expand_dims(dA,axis=0)
        ct=0
        for i in np.arange(1,len(locy),1):
            dd1 = locd[i-1]
            dd2 = locd[i]
            if dd2-dd1 > 2:
                ct = 0
                continue
            if ct == 0:
                dd = np.append(dd,locd[i-1])
                dy = np.append(dy,locy[i-1])
                temp2 = np.expand_dims(A[i-1],axis=0)
                dA = np.concatenate((dA,temp2),axis=0)
                ct = ct + 1
            if dd2-dd1<=2:
                dd=np.append(dd,locd[i])
                dy = np.append(dy,locy[i])
                temp2 = np.expand_dims(A[i],axis=0)
                dA = np.concatenate((dA,temp2),axis=0)
                ct = ct + 1

        dA=dA[1:]
        dAfin = dA

        ############ - Finding center longitude of block - ##############
        middle=[]
        for l in np.arange(0,len(dAfin),1):
            temp = np.where(dAfin[l]==1)[0]
            if len(temp) % 2 == 0:
                temp = np.append(temp,0)
            midtemp = np.median(temp)
            middle = np.append(middle,midtemp)


        #####Track blocks. Make sure that blocks overlap by at least 10 longitude points on consecutive days
        overlap = self.gibl_overlap
        btime = self.block_time
        fin = [[]]
        finloc = [[]]
        ddcopy=dd*1.0
        noloc=[]
        failloc = [[]]
        for i in np.arange(0,len(c[:,0,0]),1):
            yri = np.where(dy==i)[0]
            B = [[]]
            ddil =1.0 * ddcopy[yri]
            dyy = np.where(dy==i)[0]
            rem = []
            for dk in ddil:
                if len(ddil) < btime:
                    continue
                ddil = np.append(ddil[0]-1,ddil)
                diff = np.diff(ddil)
                diffB=[]
                dB =1
                cnt = 1
                while dB<=2:
                    diffB = np.append(diffB,ddil[cnt])
                    dB = diff[cnt-1]
                    if ddil[cnt]==ddil[-1]:
                        dB=5
                    cnt=cnt+1
                diffB = np.array(self.Remove(diffB))
                locb = []
                for ll in diffB:
                    locb = np.append(locb,np.where((dy==i) & (dd==ll))[0])
                ddil=ddil[1:]
                locbtemp = 1.0*locb
                ree=np.empty(0,dtype=int)
                for hh in np.arange(0,len(noloc),1):
                    ree = np.append(ree,np.where(locb == noloc[hh])[0])
                ree.astype(int)
                locbtemp = np.delete(locbtemp,ree)
                locb=locbtemp * 1.0
                datemp = dAfin[locb.astype(int)]
                blocktemp = [[datemp[0]]]
                locbi = np.array([locb[0]])
                ll1=0
                pass1 = 0
                ai=[0]
                add=0
                for ll in np.arange(0,len(locb)-1,1):
                    if ((dd[locb[ll+1].astype(int)] - dd[locb[ll1].astype(int)]) >=1) & ((dd[locb[ll+1].astype(int)] - dd[locb[ll1].astype(int)]) <=2):
                        add = datemp[ll1] + datemp[ll+1]
                    ai = np.where(add==2)[0]
                    if len(ai)>overlap:
                        locbi=np.append(locbi,locb[ll+1])
                        ll1=ll+1
                        add=0
                    if (len(ai)<overlap):
                        add=0
                        continue
                if len(locbi)>4:
                    noloc = np.append(noloc,locbi)
                    finloc = finloc + [locbi]
                    for jj in locbi:
                        rem = np.append(rem,np.where(dyy==jj)[0])
                    ddil = np.delete(ddcopy[yri],rem.astype(int))
                if len(locbi)<=4:
                    noloc = np.append(noloc,locbi)
                    if len(locbi)<=2:
                        failloc=failloc+[locbi]
                    for jj in locbi:
                        rem = np.append(rem,np.where(dyy==jj)[0])
                    ddil = np.delete(ddcopy[yri],rem.astype(int))

        blocks = finloc[1:]
        noblock = failloc[1:]

        ############Get rid of blocks that travel downstream##########
        ######If the center of a block travels downstream farther than 45 degrees of longitude,
        ######cancel the block the moment it travels out of this limit
        newblock = [[]]
        newnoblock=[[]]
        distthresh = self.block_travel
        for bb in blocks:
            diffb = []
            start = middle[bb[0].astype(int)]
            for bbs in bb:
                diffb = np.append(diffb, start - middle[bbs.astype(int)])
            loc = np.where(np.abs(diffb) > distthresh)[0]
            if len(loc)==0:
                newblock = newblock +[bb]
            if len(loc)>0:
                if len(bb[:loc[0]]) >4:
                    newblock = newblock + [bb[:loc[0]]]
                if len(bb[:loc[0]]) <=2:
                    noblock = noblock + [bb]

        blocks = newblock[1:]

        #Create a final array with blocking longitudes. Similar to IBL/GIBL, but containing only those that pass the duration threshold
        blockfreq = np.zeros((len(year),len(ibl[0,:,0]),360))
        savecbl=[]
        savemiddle = []
        saveyr=[]
        numblock=0
        for i in np.arange(0,len(blocks),1):
            temp = blocks[i]
            numblock=numblock+1
            for j in temp:
                j=int(j)
                daycomp = int(dd[j])
                yearcomp = int(dy[j])
                saveyr = np.append(saveyr,dy[j])
                middlecomp = middle[j].astype(int)
                mc1 = int(round(middlecomp / 2.5))
                blockfreq[yearcomp,daycomp] = blockfreq[yearcomp,daycomp] + dAfin[j]
                ct = ct + 1

        return blockfreq
import os
import netCDF4
import numpy as np
from metplus.util import pre_run_setup, config_metplus, get_start_end_interval_times, get_lead_sequence
from metplus.util import get_skip_times, skip_time, is_loop_by_init, ti_calculate, do_string_sub

def parse_steps(config_list):

    steps_config_part_fcst = [s for s in config_list if "FCST_STEPS" in s]
    steps_list_fcst = []

    steps_config_part_obs = [s for s in config_list if "OBS_STEPS" in s]
    steps_list_obs = []

    # Setup the Steps
    if steps_config_part_fcst:
        steps_param_fcst = steps_config_part_fcst[0].split("=")[1]
        steps_list_fcst = steps_param_fcst.split("+")
        config_list.remove(steps_config_part_fcst[0])
    if steps_config_part_obs:
        steps_param_obs = steps_config_part_obs[0].split("=")[1]
        steps_list_obs = steps_param_obs.split("+")
        config_list.remove(steps_config_part_obs[0])

    config = pre_run_setup(config_list)
    if not steps_config_part_fcst:
        steps_param_fcst = config.getstr('config','FCST_STEPS','')
        steps_list_fcst = steps_param_fcst.split("+")
    if not steps_config_part_obs:
        steps_param_obs = config.getstr('config','OBS_STEPS','')
        steps_list_obs = steps_param_obs.split("+")

    return steps_list_fcst, steps_list_obs, config_list 


def find_input_files(inconfig, use_init, intemplate, secondtemplate=''):
    loop_time, end_time, time_interval = get_start_end_interval_times(inconfig)
    skip_times = get_skip_times(inconfig)

    start_mth = loop_time.strftime('%m')
    template = inconfig.getraw('config',intemplate)
    if secondtemplate:
        template2 = inconfig.getraw('config',secondtemplate)
        file_list2 = []

    file_list = []
    yr_list = []
    mth_list = []
    day_list = []
    yr_full_list = []
    if use_init:
        timname = 'init'
    else:
        timname = 'valid'
    input_dict = {}
    input_dict['loop_by'] = timname
    pmth = start_mth
    while loop_time <= end_time:
        lead_seq = get_lead_sequence(inconfig)
        for ls in lead_seq:
            new_time = loop_time + ls
            input_dict[timname] = loop_time
            input_dict['lead'] = ls

            outtimestuff = ti_calculate(input_dict)
            if skip_time(outtimestuff, skip_times):
                continue
            cmth = outtimestuff['valid'].strftime('%m')
            filepath = do_string_sub(template, **outtimestuff)
            mth_list.append(cmth)
            day_list.append(outtimestuff['valid'].strftime('%d'))
            yr_full_list.append(outtimestuff['valid'].strftime('%Y'))
            if secondtemplate:
                filepath2 = do_string_sub(template2, **outtimestuff)
                if os.path.exists(filepath) and os.path.exists(filepath2):
                    file_list.append(filepath)
                    file_list2.append(filepath2)
                else:
                    file_list.append('')
                    file_list2.append('')
            else:
                if os.path.exists(filepath):
                    file_list.append(filepath)
                else:
                    file_list.append('')

            if (int(cmth) == int(start_mth)) and (int(pmth) != int(start_mth)):
                yr_list.append(int(outtimestuff['valid'].strftime('%Y')))
            pmth = cmth

        loop_time += time_interval

    if secondtemplate:
        file_list = [file_list,file_list2]
    yr_list.append(int(outtimestuff['valid'].strftime('%Y')))

    if all('' == fn for fn in file_list):
        raise Exception('No input files found as given: '+template)

    return file_list, yr_list, mth_list, day_list, yr_full_list

def read_nc_met(infiles,yrlist,invar):

    print("Reading in Data")

    #Find the first non empty file name so I can get the variable sizes
    locin = next(sub for sub in infiles if sub)
    indata = netCDF4.Dataset(locin)
    lats = indata.variables['lat'][:]
    lons = indata.variables['lon'][:]
    invar_arr = indata.variables[invar][:]
    indata.close()

    var_3d = np.empty([len(infiles),len(invar_arr[:,0]),len(invar_arr[0,:])])

    for i in range(0,len(infiles)):

        #Read in the data
        if infiles[i]:
            indata = netCDF4.Dataset(infiles[i])
            new_invar = indata.variables[invar][:]
            init_time_str = indata.variables[invar].getncattr('init_time')
            valid_time_str = indata.variables[invar].getncattr('valid_time')
            indata.close()
        else:
            new_invar = np.empty((1,len(var_3d[0,:,0]),len(var_3d[0,0,:])),dtype=float)
            new_invar[:] = np.nan
        var_3d[i,:,:] = new_invar

    yr = np.array(yrlist)
    if len(var_3d[:,0,0])%float(len(yrlist)) != 0:
        lowval = int(len(var_3d[:,0,0])/float(len(yrlist)))
        newarrlen = (lowval+1) * float(len(yrlist))
        arrexp = int(newarrlen - len(var_3d[:,0,0]))
        arrfill = np.empty((arrexp,len(var_3d[0,:,0]),len(var_3d[0,0,:])),dtype=float)
        arrfill[:] = np.nan
        var_3d = np.append(var_3d,arrfill,axis=0)
    sdim = len(var_3d[:,0,0])/float(len(yrlist))
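    # Reshape the stacked [time, lat, lon] array into [year, day-of-season, lat, lon];
    # the NaN padding above ensures every year has the same number of days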
    var_4d = np.reshape(var_3d,[len(yrlist),int(sdim),len(var_3d[0,:,0]),len(var_3d[0,0,:])])

    return var_4d,lats,lons,yr

Running METplus

This use case is run in the following ways:

  1. Passing in UserScript_obsERA_obsOnly_Blocking.py then a user-specific system configuration file:

    master_metplus.py -c /path/to/METplus/parm/use_cases/model_applications/s2s/UserScript_obsERA_obsOnly_Blocking.py -c /path/to/user_system.conf
    
  2. Modifying the configurations in parm/metplus_config, then passing in UserScript_obsERA_obsOnly_Blocking.py:

    master_metplus.py -c /path/to/METplus/parm/use_cases/model_applications/s2s/UserScript_obsERA_obsOnly_Blocking.py
    

The following variables must be set correctly:

  • INPUT_BASE - Path to directory where sample data tarballs are unpacked (See Datasets section to obtain tarballs). This is not required to run METplus, but it is required to run the examples in parm/use_cases

  • OUTPUT_BASE - Path where METplus output will be written. This must be in a location where you have write permissions

  • MET_INSTALL_DIR - Path to location where MET is installed locally

Example User Configuration File:

[dir]
INPUT_BASE = /path/to/sample/input/data
OUTPUT_BASE = /path/to/output/dir
MET_INSTALL_DIR = /path/to/met-X.Y

Expected Output

Refer to the value set for OUTPUT_BASE to find where the output data was generated. Output for this use case will be found in model_applications/s2s/Blocking (relative to OUTPUT_BASE) and will contain output for the steps requested. This may include the regridded data, daily averaged files, running mean files, and anomaly files. In addition, output CBL, IBL, and Blocking frequency plots can be generated. The location of these output plots can be specified as BLOCKING_PLOT_OUTPUT_DIR. If it is not specified, plots will be sent to model_applications/s2s/Blocking/plots (relative to OUTPUT_BASE).

Keywords
