UserScript: Make MaKE-MaKI plot from calculated MaKE and MaKI indices

model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO.py

Scientific Objective

To compute the MJO-Kelvin wave-ENSO (MaKE) and MJO-Kelvin wave-Influence (MaKI) indices (Lybarger et al., 2020) using the zonal and meridional components of the surface wind stress (TAUX, TAUY), the zonal and meridional components of the surface ocean currents (UCUR, VCUR), and sea surface temperature (SST). Specifically, the MaKE and MaKI indices are computed from TAUX, TAUY, UCUR, VCUR, and SST data over the region 30S-30N, 125E-80W. Daily anomalies of the wind stress components are bandpass filtered for 30-90 days using a Convolutional Neural Network (CNN)-based filter whose weights are computed offline. The filtered wind stress components are projected onto four Empirical Orthogonal Functions (EOFs). The resulting time series (PCs) are standardized and combined with the EOFs to obtain the MJO component of the surface wind stress (TAUX_MJO, TAUY_MJO). UCUR and VCUR daily anomalies are multiplied by the meridional structure of the Kelvin wave (UCUR_K, VCUR_K). The wind power due to the MJO component of the wind stress and oceanic Kelvin waves (W_MJO,K) is then computed as TAUX_MJO*UCUR_K + TAUY_MJO*VCUR_K (a simplified sketch of this step follows the reference below). The standardized wind power and SST are projected onto the first two multivariate EOFs of W_MJO,K and SST. The resulting daily time series (PCs) are normalized and used to compute monthly values of MaKE and MaKI, which are saved to a text (.csv) file and plotted as time series.

  • Lybarger, N.D., C.-S. Shin, and C. Stan, 2020: MJO wind energy and prediction of El Niño. Journal of Geophysical Research: Oceans, 125, e2020JC016732, doi:10.1029/2020JC016732
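
The central quantity behind both indices is the wind power supplied by the MJO component of the wind stress to oceanic Kelvin waves. Below is a minimal sketch of that step with placeholder variable names; the use case itself relies on the metcalcpy routines (calc_tau_MJO, calc_wpower_MJO, make_maki) called in the driver script shown later on this page.

    def mjo_kelvin_wind_power(taux_mjo, tauy_mjo, ucur_k, vcur_k):
        """Wind power W_MJO,K = TAUX_MJO*UCUR_K + TAUY_MJO*VCUR_K.

        Inputs are xarray DataArrays (or numpy arrays) on a common grid:
        the MJO component of the wind stress and the Kelvin-wave-weighted
        surface currents.
        """
        return taux_mjo * ucur_k + tauy_mjo * vcur_k

    # The driver then averages the wind power over 5S-5N, for example
    #   wmjoks = wpower.sel(lat=slice(-5, 5)).mean(dim='lat', skipna=True)
    # before projecting it, together with SST, onto the multivariate EOFs.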

Datasets

  • Forecast dataset: None

  • Observation dataset: CFSR Reanalysis

External Dependencies

You will need to use a version of Python 3.6+ that has the following packages installed:

* numpy
* netCDF4
* datetime
* xarray
* matplotlib
* pandas

If the version of Python used to compile MET did not have these libraries at the time of compilation, you will need to add these packages or create a new Python environment with these packages.

If this is the case, you will need to set the MET_PYTHON_EXE environment variable to the path of the version of Python you want to use. If you want this version of Python to only apply to this use case, set it in the [user_env_vars] section of a METplus configuration file:

[user_env_vars]
MET_PYTHON_EXE = /path/to/python/with/required/packages/bin/python

METplus Components

This use case runs the MJO-ENSO driver, which first computes the MJO components of taux and tauy, then the MJO wind power, then the MJO-ENSO indices, and finally a plot of the indices. Inputs to the MJO-ENSO driver are netCDF files that follow MET's netCDF format. In addition, text files listing the input netCDF files for taux, tauy, u, v, and SST are required; METplus generates these file lists automatically (an example of the format is shown below). Optional pre-processing steps include RegridDataPlane for regridding the data.
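
METplus builds each file list as a plain-text file whose first line is the keyword file_list, followed by one input file path per line. The paths below are illustrative only (they follow the input template used later in this configuration):

    file_list
    {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/zonalWindStress/cfsr_zonalWindStress_19900101.nc
    {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/zonalWindStress/cfsr_zonalWindStress_19900102.nc
    ...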

METplus Workflow

The MJO-ENSO driver Python script is run for each lead time on the forecast and observation data. This example loops by valid time for the model pre-processing and by valid time for the other steps. This version is configured to run only the MaKE and MaKI calculation, omitting the regridding, mean daily annual cycle, and daily anomaly pre-processing steps. However, the configurations for the pre-processing steps are available for user reference.

METplus Configuration

METplus first loads all of the configuration files found in parm/metplus_config, then it loads any configuration files passed to METplus via the command line, e.g. parm/use_cases/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO.conf. The script UserScript_obsCFSR_obsOnly_MJO_ENSO/mjo_enso_driver.py runs the Python program, and UserScript_obsCFSR_obsOnly_MJO_ENSO.conf sets the variables for all steps of the MJO-ENSO use case.

[config]

# Documentation for this use case can be found at
# https://metplus.readthedocs.io/en/latest/generated/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO.html

# For additional information, please see the METplus Users Guide.
# https://metplus.readthedocs.io/en/latest/Users_Guide

###
# Processes to run
# https://metplus.readthedocs.io/en/latest/Users_Guide/systemconfiguration.html#process-list
###

# All steps, including creating daily means and mean daily annual cycle
#PROCESS_LIST = RegridDataPlane(regrid_obs_taux), RegridDataPlane(regrid_obs_tauy), RegridDataPlane(regrid_obs_sst), RegridDataPlane(regrid_obs_ucur),  RegridDataPlane(regrid_obs_vcur), UserScript(script_mjo_enso)
# Computing regridding, and MJO ENSO Analysis script
#PROCESS_LIST = RegridDataPlane(regrid_obs_taux), RegridDataPlane(regrid_obs_tauy), RegridDataPlane(regrid_obs_sst), RegridDataPlane(regrid_obs_ucur),  RegridDataPlane(regrid_obs_vcur), UserScript(script_mjo_enso)

PROCESS_LIST = UserScript(script_mjo_enso)


###
# Time Info
# LOOP_BY options are INIT, VALID, RETRO, and REALTIME
# If set to INIT or RETRO:
#   INIT_TIME_FMT, INIT_BEG, INIT_END, and INIT_INCREMENT must also be set
# If set to VALID or REALTIME:
#   VALID_TIME_FMT, VALID_BEG, VALID_END, and VALID_INCREMENT must also be set
# LEAD_SEQ is the list of forecast leads to process
# https://metplus.readthedocs.io/en/latest/Users_Guide/systemconfiguration.html#timing-control
###


LOOP_BY = VALID
VALID_TIME_FMT = %Y%m%d
VALID_BEG = 19900101
VALID_END = 20211231
VALID_INCREMENT = 86400

LEAD_SEQ = 0


# Run the obs for these cases
OBS_RUN = True
FCST_RUN = False


###
# RegridDataPlane Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#regriddataplane
###

# Mask to use for regridding
REGRID_DATA_PLANE_VERIF_GRID = latlon 156 61 -30 125 1 1  
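# The grid specification above follows MET's "latlon Nx Ny lat_ll lon_ll delta_lat delta_lon"
# convention: 156 x 61 points starting at 30S, 125E with 1 degree spacing,
# covering the 30S-30N, 125E-80W domain described in the Scientific Objective.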

# Method to run regrid_data_plane, not setting this will default to NEAREST
REGRID_DATA_PLANE_METHOD = NEAREST

# Regridding width used in regrid_data_plane, not setting this will default to 1
REGRID_DATA_PLANE_WIDTH = 1


###
# RegridDataPlane(regrid_obs_taux) Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#regriddataplane
###

# Configurations for regrid_data_plane: Regrid zonal wind stress
[regrid_obs_taux]
# Run regrid_data_plane on observation data
OBS_REGRID_DATA_PLANE_RUN = {OBS_RUN}

# If true, process each field individually and write a file for each
# If false, run once per run time passing in all fields specified
REGRID_DATA_PLANE_ONCE_PER_FIELD = False

# Name of input field to process
OBS_REGRID_DATA_PLANE_VAR1_NAME = uflx 

# Name of output field to create
OBS_REGRID_DATA_PLANE_VAR1_OUTPUT_FIELD_NAME = uflx 

# input and output data directories for each application in PROCESS_LIST
OBS_REGRID_DATA_PLANE_INPUT_DIR ={INPUT_BASE}/zonalWindStress/ 
OBS_REGRID_DATA_PLANE_OUTPUT_DIR = {OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/zonalWindStress/

# format of filenames
# Input CFSR  
OBS_REGRID_DATA_PLANE_INPUT_TEMPLATE = cfsr_zonalWindStress_{valid?fmt=%Y%m%d}.nc
OBS_REGRID_DATA_PLANE_OUTPUT_TEMPLATE =cfsr_zonalWindStress_{valid?fmt=%Y%m%d}.nc


###
# RegridDataPlane(regrid_obs_tauy) Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#regriddataplane
###

# Configurations for regrid_data_plane: Regrid meridional wind stress
[regrid_obs_tauy]
# Run regrid_data_plane on observation data
OBS_REGRID_DATA_PLANE_RUN = {OBS_RUN}

# If true, process each field individually and write a file for each
# If false, run once per run time passing in all fields specified
REGRID_DATA_PLANE_ONCE_PER_FIELD = False

# Name of input field to process
OBS_REGRID_DATA_PLANE_VAR1_NAME = vflx 

# Name of output field to create
OBS_REGRID_DATA_PLANE_VAR1_OUTPUT_FIELD_NAME = vflx 

# input and output data directories for each application in PROCESS_LIST
OBS_REGRID_DATA_PLANE_INPUT_DIR ={INPUT_BASE}/meridionalWindStress/ 
OBS_REGRID_DATA_PLANE_OUTPUT_DIR = {OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/meridionalWindStress/

# format of filenames
# Input CFSR 
OBS_REGRID_DATA_PLANE_INPUT_TEMPLATE = cfsr_meridionalWindStress_{valid?fmt=%Y%m%d}.nc
OBS_REGRID_DATA_PLANE_OUTPUT_TEMPLATE = cfsr_meridionalWindStress_{valid?fmt=%Y%m%d}.nc


###
# RegridDataPlane(regrid_obs_sst) Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#regriddataplane
###

# Configurations for regrid_data_plane: Regrid sst 
[regrid_obs_sst]
# Run regrid_data_plane on observation data
OBS_REGRID_DATA_PLANE_RUN = {OBS_RUN}

# If true, process each field individually and write a file for each
# If false, run once per run time passing in all fields specified
REGRID_DATA_PLANE_ONCE_PER_FIELD = False

# Name of input field to process
OBS_REGRID_DATA_PLANE_VAR1_NAME =sst 

# Name of output field to create
OBS_REGRID_DATA_PLANE_VAR1_OUTPUT_FIELD_NAME = sst 

# input and output data directories for each application in PROCESS_LIST
OBS_REGRID_DATA_PLANE_INPUT_DIR ={INPUT_BASE}/sst/ 
OBS_REGRID_DATA_PLANE_OUTPUT_DIR = {OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/sst/

OBS_REGRID_DATA_PLANE_INPUT_TEMPLATE = cfsr_sst_{valid?fmt=%Y%m%d}.nc
OBS_REGRID_DATA_PLANE_OUTPUT_TEMPLATE = cfsr_sst_{valid?fmt=%Y%m%d}.nc


###
# RegridDataPlane(regrid_obs_ucur) Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#regriddataplane
###

# Configurations for regrid_data_plane: Regrid zonal ocean current  
[regrid_obs_ucur]
# Run regrid_data_plane on observation data
OBS_REGRID_DATA_PLANE_RUN = {OBS_RUN}

# If true, process each field individually and write a file for each
# If false, run once per run time passing in all fields specified
REGRID_DATA_PLANE_ONCE_PER_FIELD = False

# Name of input field to process
OBS_REGRID_DATA_PLANE_VAR1_NAME = u 

# Name of output field to create
OBS_REGRID_DATA_PLANE_VAR1_OUTPUT_FIELD_NAME = u 

# input and output data directories for each application in PROCESS_LIST
OBS_REGRID_DATA_PLANE_INPUT_DIR ={INPUT_BASE}/zonalOceanCurrent/ 
OBS_REGRID_DATA_PLANE_OUTPUT_DIR = {OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/zonalOceanCurrent/

OBS_REGRID_DATA_PLANE_INPUT_TEMPLATE = cfsr_zonalOceanCurrent_{valid?fmt=%Y%m%d}.nc
OBS_REGRID_DATA_PLANE_OUTPUT_TEMPLATE = cfsr_zonalOceanCurrent_{valid?fmt=%Y%m%d}.nc


###
# RegridDataPlane(regrid_obs_vcur) Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#regriddataplane
###

# Configurations for regrid_data_plane: Regrid meridional ocean current 
[regrid_obs_vcur]
# Run regrid_data_plane on observation data
OBS_REGRID_DATA_PLANE_RUN = {OBS_RUN}

# If true, process each field individually and write a file for each
# If false, run once per run time passing in all fields specified
REGRID_DATA_PLANE_ONCE_PER_FIELD = False

# Name of input field to process
OBS_REGRID_DATA_PLANE_VAR1_NAME = v 

# Name of output field to create
OBS_REGRID_DATA_PLANE_VAR1_OUTPUT_FIELD_NAME = v 

# input and output data directories for each application in PROCESS_LIST
OBS_REGRID_DATA_PLANE_INPUT_DIR ={INPUT_BASE}/meridionalOceanCurrent/ 
OBS_REGRID_DATA_PLANE_OUTPUT_DIR = {OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/meridionalOceanCurrent/

# format of filenames
# Input CFSR 
OBS_REGRID_DATA_PLANE_INPUT_TEMPLATE = cfsr_meridionalOceanCurrent_{valid?fmt=%Y%m%d}.nc
OBS_REGRID_DATA_PLANE_OUTPUT_TEMPLATE = cfsr_meridionalOceanCurrent_{valid?fmt=%Y%m%d}.nc


###
# UserScript(script_mjo_enso) Settings
# https://metplus.readthedocs.io/en/latest/Users_Guide/wrappers.html#userscript
###

# Configurations for UserScript: Run the MJO_ENSO Analysis driver
[script_mjo_enso]
# list of strings to loop over for each run time.
# Run the user script once per lead
USER_SCRIPT_RUNTIME_FREQ = RUN_ONCE_PER_LEAD

# Template of filenames to input to the user-script
#USER_SCRIPT_INPUT_TEMPLATE = {OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/zonalWindStress/cfsr_zonalWindStress_{valid?fmt=%Y%m%d}.nc,{OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/meridionalWindStress/cfsr_meridionalWindStress_{valid?fmt=%Y%m%d}.nc,{OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/sst/cfsr_sst_{valid?fmt=%Y%m%d}.nc,{OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/zonalOceanCurrent/cfsr_zonalOceanCurrent_{valid?fmt=%Y%m%d}.nc,{OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Regrid/meridionalOceanCurrent/cfsr_meridionalOceanCurrent_{valid?fmt=%Y%m%d}.nc

USER_SCRIPT_INPUT_TEMPLATE = {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/zonalWindStress/cfsr_zonalWindStress_{valid?fmt=%Y%m%d}.nc,{INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/meridionalWindStress/cfsr_meridionalWindStress_{valid?fmt=%Y%m%d}.nc,{INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/sst/cfsr_sst_{valid?fmt=%Y%m%d}.nc,{INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/zonalOceanCurrent/cfsr_zonalOceanCurrent_{valid?fmt=%Y%m%d}.nc,{INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/meridionalOceanCurrent/cfsr_meridionalOceanCurrent_{valid?fmt=%Y%m%d}.nc

# Name of the file containing the listing of input files
# The options are OBS_TAUX_INPUT, OBS_TAUY_INPUT, OBS_SST_INPUT, OBS_UCUR_INPUT, OBS_VCUR_INPUT, FCST_TAUX_INPUT, FCST_TAUY_INPUT, FCST_SST_INPUT, FCST_UCUR_INPUT,and FCST_VCUR_INPUT
# *** Make sure the order is the same as the order of templates listed in USER_SCRIPT_INPUT_TEMPLATE
USER_SCRIPT_INPUT_TEMPLATE_LABELS = OBS_TAUX_INPUT,OBS_TAUY_INPUT, OBS_SST_INPUT, OBS_UCUR_INPUT, OBS_VCUR_INPUT
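# METplus writes each file list to disk and exports its path in an environment
# variable named METPLUS_FILELIST_<LABEL> (e.g. METPLUS_FILELIST_OBS_TAUX_INPUT);
# mjo_enso_driver.py reads these variables to locate its input files.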

# Command to run the user script with input configuration file
USER_SCRIPT_COMMAND = {METPLUS_BASE}/parm/use_cases/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/mjo_enso_driver.py


# Configurations for the MJO-ENSO analysis script
[user_env_vars]
# Whether to Run the model or obs
RUN_OBS = {OBS_RUN}
RUN_FCST = {FCST_RUN}

# Make OUTPUT_BASE Available to the script
SCRIPT_OUTPUT_BASE = {OUTPUT_BASE}

# Number of obs per day
OBS_PER_DAY = 1

# Variable names for TAUX, TAUY, SST, UCUR, VCUR
OBS_TAUX_VAR_NAME = uflx
OBS_TAUY_VAR_NAME = vflx
OBS_SST_VAR_NAME = sst
OBS_UCUR_VAR_NAME = u
OBS_VCUR_VAR_NAME = v

# EOF Filename
TAUX_EOF_INPUT_FILE = {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Data/cfs_uflx_eof.nc
TAUY_EOF_INPUT_FILE = {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Data/cfs_vflx_eof.nc
WMJOK_SST_EOF_INPUT_FILE = {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Data/cfs_multivarEOF.nc

# Filters weights
TAUX_Filter1_TEXTFILE = {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Data/taux.filter1.txt
TAUX_Filter2_TEXTFILE = {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Data/taux.filter2.txt
TAUY_Filter1_TEXTFILE = {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Data/tauy.filter1.txt
TAUY_Filter2_TEXTFILE = {INPUT_BASE}/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/Data/tauy.filter2.txt


# Output Directory for the plots
# If not set, this will default to {OUTPUT_BASE}/plots
PLOT_OUTPUT_DIR = {OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/plots

# MaKE, MaKI indices output file
MAKE_MAKI_OUTPUT_TEXT_FILE = {OUTPUT_BASE}/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/MAKE-MAKI


# Plot start date, end date, output name, and format
PLOT_TIME_BEG = 19900101
PLOT_TIME_END = 20211231
PLOT_TIME_FMT = {VALID_TIME_FMT}
OBS_PLOT_OUTPUT_NAME = MAKE_MAKI_timeseries
OBS_PLOT_OUTPUT_FORMAT = png

MET Configuration

METplus sets environment variables based on the values in the METplus configuration file. These variables are referenced in the MET configuration file. YOU SHOULD NOT SET ANY OF THESE ENVIRONMENT VARIABLES YOURSELF! THEY WILL BE OVERWRITTEN BY METPLUS WHEN IT CALLS THE MET TOOLS! If there is a setting in the MET configuration file that is not controlled by an environment variable, you can add additional environment variables to be set only within the METplus environment using the [user_env_vars] section of the METplus configuration files. See the ‘User Defined Config’ section on the ‘System Configuration’ page of the METplus User’s Guide for more information.

Python Scripts

The MJO-ENSO driver script orchestrates the calculation of the MaKE and MaKI indices and the generation of a text file and a plot for the indices: parm/use_cases/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/mjo_enso_driver.py:

#!/usr/bin/env python3

import xarray as xr
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import os
import datetime
import warnings

import metcalcpy.contributed.mjo_enso.compute_mjo_enso as mj
import metplotpy.contributed.mjo_enso.plot_mjo_enso_indices as plt
import METreadnc.util.read_netcdf as read_netcdf


def read_eofs(taux_eofs_file, tauy_eofs_file, meofs_file):
     
    taux_eofs=xr.open_dataset(taux_eofs_file).eof
    tauy_eofs=xr.open_dataset(tauy_eofs_file).eof
    meofs = xr.open_dataset(meofs_file).meofs

    return taux_eofs,tauy_eofs,meofs 

def read_filters(filtx1fil,filtx2fil,filty1fil,filty2fil):
    filtx1=np.loadtxt(filtx1fil, delimiter=',')
    filtx2=np.loadtxt(filtx2fil, delimiter=',')
    filty1=np.loadtxt(filty1fil, delimiter=',')
    filty2=np.loadtxt(filty2fil, delimiter=',')

    return filtx1,filtx2,filty1,filty2

def run_mjo_enso_steps(inlabel,spd,filtx1,filtx2,filty1,filty2,taux_eofs,tauy_eofs,meofs,oplot_dir):
    
    # Get TAUX, TAUY, SST, UCURRENT, VCURRENT file listings and variable names
    taux_filetxt = os.environ['METPLUS_FILELIST_'+inlabel+'_TAUX_INPUT']
    tauy_filetxt = os.environ['METPLUS_FILELIST_'+inlabel+'_TAUY_INPUT']
    sst_filetxt = os.environ['METPLUS_FILELIST_'+inlabel+'_SST_INPUT']
    ucur_filetxt = os.environ['METPLUS_FILELIST_'+inlabel+'_UCUR_INPUT']
    vcur_filetxt = os.environ['METPLUS_FILELIST_'+inlabel+'_VCUR_INPUT']

    taux_var = os.environ[inlabel+'_TAUX_VAR_NAME']
    tauy_var = os.environ[inlabel+'_TAUY_VAR_NAME']
    sst_var = os.environ[inlabel+'_SST_VAR_NAME']
    u_var = os.environ[inlabel+'_UCUR_VAR_NAME']
    v_var = os.environ[inlabel+'_VCUR_VAR_NAME']

    # Read the listing of TAUX, TAUY, SST, UCUR, VCUR files
    with open(taux_filetxt) as tx:
        taux_input_files = tx.read().splitlines()
    if (taux_input_files[0] == 'file_list'):
        taux_input_files = taux_input_files[1:]
    with open(tauy_filetxt) as ty:
        tauy_input_files = ty.read().splitlines()
    if (tauy_input_files[0] == 'file_list'):
        tauy_input_files = tauy_input_files[1:]
    with open(sst_filetxt) as ts:
        sst_input_files = ts.read().splitlines()
    if (sst_input_files[0] == 'file_list'):
        sst_input_files = sst_input_files[1:]
    with open(ucur_filetxt) as uc:
        ucur_input_files = uc.read().splitlines()
    if (ucur_input_files[0] == 'file_list'):
        ucur_input_files = ucur_input_files[1:]
    with open(vcur_filetxt) as vc:
        vcur_input_files = vc.read().splitlines()
    if (vcur_input_files[0] == 'file_list'):
        vcur_input_files = vcur_input_files[1:]

    # Check the input data to make sure it's not all missing
    taux_allmissing = all(elem == 'missing' for elem in taux_input_files)
    if taux_allmissing:
        raise IOError ('No input TAUX files were found, check file paths')
    tauy_allmissing = all(elem == 'missing' for elem in tauy_input_files)
    if tauy_allmissing:
        raise IOError('No input TAUY files were found, check file paths')
    sst_allmissing = all(elem == 'missing' for elem in sst_input_files)
    if sst_allmissing:
        raise IOError('No input SST files were found, check file paths')
    ucur_allmissing = all(elem == 'missing' for elem in ucur_input_files)
    if ucur_allmissing:
        raise IOError('No input UCUR files were found, check file paths')
    vcur_allmissing = all(elem == 'missing' for elem in vcur_input_files)
    if vcur_allmissing:
        raise IOError('No input VCUR files were found, check file paths')
 
    netcdf_reader_taux=read_netcdf.ReadNetCDF()
    ds_taux=netcdf_reader_taux.read_into_xarray(taux_input_files)

    netcdf_reader_tauy=read_netcdf.ReadNetCDF()
    ds_tauy=netcdf_reader_tauy.read_into_xarray(tauy_input_files)

    netcdf_reader_sst=read_netcdf.ReadNetCDF()
    ds_sst=netcdf_reader_sst.read_into_xarray(sst_input_files)

    netcdf_reader_ucur=read_netcdf.ReadNetCDF()
    ds_ucur=netcdf_reader_ucur.read_into_xarray(ucur_input_files)

    netcdf_reader_vcur=read_netcdf.ReadNetCDF()
    ds_vcur=netcdf_reader_vcur.read_into_xarray(vcur_input_files)

    time = []
    for din in range(len(ds_taux)):
        ctaux = ds_taux[din]
        #ctime =  datetime.datetime.strptime(ctaux[taux_var].valid_time,'%Y%m%d_%H%M%S')
        ctime =  datetime.datetime.strptime(str(ctaux['time'][0].values)[0:10],'%Y-%m-%d')
        time.append(ctime.strftime('%Y-%m-%d'))
        #ctaux = ctaux.assign_coords(time=ctime)
        #ds_taux[din] = ctaux.expand_dims("time")

        ctauy = ds_tauy[din]
        #ctauy = ctauy.assign_coords(time=ctime)
        #ds_tauy[din] = ctauy.expand_dims("time")

        csst = ds_sst[din]
        #csst = csst.assign_coords(time=ctime)
        #ds_sst[din] = csst.expand_dims("time")

        cucur = ds_ucur[din]
        #cucur = cucur.assign_coords(time=ctime)
        #ds_ucur[din] = cucur.expand_dims("time")

        cvcur = ds_vcur[din]
        #cvcur = cvcur.assign_coords(time=ctime)
        #ds_vcur[din] = cvcur.expand_dims("time")

    time = np.array(time,dtype='datetime64[D]')

    everything_taux = xr.concat(ds_taux,"time")
    uflxa = everything_taux[taux_var]

    everything_tauy = xr.concat(ds_tauy,"time")
    vflxa = everything_tauy[tauy_var]

    everything_sst = xr.concat(ds_sst,"time")
    sst = everything_sst[sst_var]

    everything_ucur = xr.concat(ds_ucur,"time")
    u = everything_ucur[u_var]

    everything_vcur = xr.concat(ds_vcur,"time")
    v = everything_vcur[v_var]
    print(v.shape)
  
    # get taux_mjo and tauy_mjo

    uflx_mjo=mj.calc_tau_MJO(uflxa,taux_eofs,filtx1,filtx2)
    vflx_mjo=mj.calc_tau_MJO(vflxa,tauy_eofs,filty1,filty2)
   
    wpower=mj.calc_wpower_MJO(u,v,uflx_mjo,vflx_mjo)

    #sst = ds.sst.sel(lat=slice(-5,5)).mean(dim='lat',skipna=True)
    sst = sst.sel(lat=slice(-5,5)).mean(dim='lat',skipna=True)

    wmjoks = wpower.sel(lat=slice(-5,5)).mean(dim='lat',skipna=True)

    make,maki=mj.make_maki(sst,wmjoks,meofs)

    #Get the index output file
    index_file = os.environ['MAKE_MAKI_OUTPUT_TEXT_FILE']
    import csv
    date_format = '%Y-%m-%d'
    strDate=datetime.datetime.strptime(str(sst['time'][0].values)[0:10],date_format)
    endDate=datetime.datetime.strptime(str(sst['time'][-1].values)[0:10],date_format) 
    time_mon = pd.date_range(strDate, endDate, freq='MS')#.to_pydatetime().tolist()
    with open(index_file+'.csv', 'w', newline='') as file:
        writer = csv.writer(file)
        writer.writerow(["Date", "MaKE", "MaKI"])
        for i in range(len(make)):
            writer.writerow([time_mon[i], make[i].data, maki[i].data])
    
    #Get times for plotting MaKE and MaKI indices
    plot_time_format = os.environ['PLOT_TIME_FMT'] 
    plot_start_time = datetime.datetime.strptime(os.environ['PLOT_TIME_BEG'],plot_time_format)
    plot_end_time = datetime.datetime.strptime(os.environ['PLOT_TIME_END'],plot_time_format)    

    make_plot = make.sel(time=slice(plot_start_time,plot_end_time))
    maki_plot = maki.sel(time=slice(plot_start_time,plot_end_time))

    # Get the output name and format for the MaKE and MaKi plot
    plot_name = os.path.join(oplot_dir,os.environ.get(inlabel+'_PLOT_OUTPUT_NAME',inlabel+'_MAKE_MAKI_timeseries'))
    plot_format = os.environ.get(inlabel+'_PLOT_OUTPUT_FORMAT','png')

    #plot the MaKE-MaKI indices
    plt.plot_make_maki(make_plot,maki_plot,np.array(make_plot['time'].values),plot_name,plot_format)
    

def main():
    
    # Get the EOF files
    taux_eofs_file = os.environ['TAUX_EOF_INPUT_FILE'] 
    tauy_eofs_file = os.environ['TAUY_EOF_INPUT_FILE'] 
    meofs_file = os.environ['WMJOK_SST_EOF_INPUT_FILE'] 

    # Read in the EOFS
    print('Reading the EOFs')
    taux_eofs,tauy_eofs,meofs = read_eofs(taux_eofs_file, tauy_eofs_file, meofs_file)
    print('Done with reading EOFs')

    #Get the filter weights files
    filtx1fil = os.environ['TAUX_Filter1_TEXTFILE']
    filtx2fil = os.environ['TAUX_Filter2_TEXTFILE']
    filty1fil = os.environ['TAUY_Filter1_TEXTFILE']
    filty2fil = os.environ['TAUY_Filter2_TEXTFILE']
    
    # Read in the weights of the filters
    filtx1,filtx2,filty1,filty2 = read_filters(filtx1fil,filtx2fil,filty1fil,filty2fil)

    # Get Number of Obs per day
    spd = os.environ.get('OBS_PER_DAY',1)

    # Check for an output plot directory
    oplot_dir = os.environ.get('PLOT_OUTPUT_DIR','')
    if not oplot_dir:
        obase = os.environ['SCRIPT_OUTPUT_BASE']
        oplot_dir = os.path.join(obase,'plots')
    if not os.path.exists(oplot_dir):
        os.makedirs(oplot_dir) 
     
    # Determine if doing forecast or obs
    run_obs_mjo_enso = os.environ.get('RUN_OBS', 'False').lower()
    run_fcst_mjo_enso = os.environ.get('RUN_FCST', 'False').lower()

    if (run_obs_mjo_enso == 'true'):
        run_mjo_enso_steps('OBS', spd, filtx1, filtx2, filty1, filty2, taux_eofs, tauy_eofs, meofs,oplot_dir)

    if (run_fcst_mjo_enso == 'true'):
        run_mjo_enso_steps('FCST', spd, filtx1, filtx2, filty1, filty2, taux_eofs, tauy_eofs, meofs,oplot_dir)

    # nothing selected
    if (run_obs_mjo_enso == 'false') and (run_fcst_mjo_enso == 'false'):
        warnings.warn('Forecast and Obs runs not selected, nothing will be calculated')
        warnings.warn('Set RUN_FCST or RUN_OBS in the [user_env_vars] section to generate output')

if __name__ == "__main__":
    main()

Running METplus

This use case is run in the following ways:

  1. Passing in UserScript_obsCFSR_obsOnly_MJO_ENSO.conf then a user-specific system configuration file:

    run_metplus.py -c /path/to/METplus/parm/use_cases/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO.conf -c /path/to/user_system.conf
    
  2. Modifying the configurations in parm/metplus_config, then passing in UserScript_obsCFSR_obsOnly_MJO_ENSO.conf:

    run_metplus.py -c /path/to/METplus/parm/use_cases/model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO.conf
    

The following variables must be set correctly:

  • INPUT_BASE - Path to directory where sample data tarballs are unpacked (See Datasets section to obtain tarballs). This is not required to run METplus, but it is required to run the examples in parm/use_cases

  • OUTPUT_BASE - Path where METplus output will be written. This must be in a location where you have write permissions

  • MET_INSTALL_DIR - Path to location where MET is installed locally

Example User Configuration File:

[dir]
INPUT_BASE = /path/to/sample/input/data
OUTPUT_BASE = /path/to/output/dir
MET_INSTALL_DIR = /path/to/met-X.Y

Expected Output

Refer to the value set for OUTPUT_BASE to find where the output data was generated. Output for this use case will be found in model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO and may include the regridded data. In addition, a text (.csv) file and a time series plot are generated. The name of the text file can be specified with MAKE_MAKI_OUTPUT_TEXT_FILE. The plot output location can be specified with PLOT_OUTPUT_DIR; if it is not specified, the plot will be written to model_applications/s2s_mjo/UserScript_obsCFSR_obsOnly_MJO_ENSO/plots (relative to OUTPUT_BASE). The name of the plot file can be specified with OBS_PLOT_OUTPUT_NAME.
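
The .csv file contains one header row followed by one row per month; the values below are placeholders, and the actual numbers depend on the input data:

    Date,MaKE,MaKI
    <monthly date>,<MaKE value>,<MaKI value>
    <monthly date>,<MaKE value>,<MaKI value>
    ...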

Keywords

Note

  • S2SAppUseCase

  • S2SMJOAppUseCase

  • NetCDFFileUseCase

  • RegridDataPlaneUseCase

  • PCPCombineUseCase

  • METcalcpyUseCase

  • METplotpyUseCase

Navigate to METplus Quick Search for Use Cases to discover other similar use cases.
