Using conda install on a server

conda is great: it makes installing lots of packages easy. However, if you have an account on a server which already has conda installed, you should use conda to install packages only for your own user, not system-wide. To do this, make use of conda environments.

For example, let’s say you want to use cdo on your server. First, create a conda environment; I’ve called mine cdo-env:

conda create -y -n cdo-env

Now activate your environment, so you can work within it:

conda activate cdo-env

When you’re in the environment, the shell prompt may include the name of the environment, as a handy reminder:

(cdo-env) [sweeneyc@login ~]$

Now you can install cdo within your cdo-env environment:

conda install -c conda-forge cdo

If all goes well, this should install cdo within your cdo-env environment, and you can use it whenever you like!

To close/exit your conda environment, use the command:

conda deactivate

That’s it!

Downloading ERA5 data

Install the Climate Data Store (CDS) API on Linux:

I’ve installed mine on the mccdata server in UCD.

The great thing about using a script, instead of downloading directly from the CDS, is that it gives you more control. The ERA5 CDS page for ERA5 monthly averaged data on single levels from 1979 to present doesn’t let you change the grid or domain. I wanted data at a lower resolution, and only for 20-90N. I was able to specify this in my retrieval script:

import cdsapi

c = cdsapi.Client()

c.retrieve(
    'reanalysis-era5-single-levels-monthly-means',
    {
        'product_type':'monthly_averaged_reanalysis',
        'variable':'mean_sea_level_pressure',
        'year':[
            '1979','1980','1981',
            '1982','1983','1984',
            '1985','1986','1987',
            '1988','1989','1990',
            '1991','1992','1993',
            '1994','1995','1996',
            '1997','1998','1999',
            '2000','2001','2002',
            '2003','2004','2005',
            '2006','2007','2008',
            '2009','2010','2011',
            '2012','2013','2014',
            '2015','2016','2017',
            '2018','2019'
        ],
        'month':[
            '01','02','03',
            '04','05','06',
            '07','08','09',
            '10','11','12'
        ],
        'time':'00:00',
        'area':[90, -180, 20, 180], # North, West, South, East. Default: global
        'grid':[1.0, 1.0], # Latitude/longitude grid: east-west (longitude) and north-south resolution (latitude). Default: 0.25 x 0.25
        'format':'netcdf'
    },
    'ERA5_mslp.nc')

Then I ran this from the command line:

python ERA5.mslp.CDS

And this quickly retrieved the required data!
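As a quick sanity check on what the area and grid settings above should give you: 1.0° spacing over 20-90N and the full longitude circle works out to a 71 x 360 grid. A small sketch of the arithmetic (the north-to-south latitude ordering here is an assumption, but the sizes should match the downloaded file):

```python
import numpy as np

# area = [North, West, South, East] and grid = [dlon, dlat], as in the request
north, west, south, east = 90, -180, 20, 180
dlat = dlon = 1.0

# ERA5 NetCDF files typically list latitudes north-to-south
lats = np.arange(north, south - dlat / 2, -dlat)   # 90, 89, ..., 20
lons = np.arange(west, east, dlon)                 # -180, -179, ..., 179

print(lats.size, lons.size)  # 71 360
```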

Easy way to install cdo!

There is a wonderfully easy way to install cdo, which takes care of all the libraries for you, and you can do it within your user directory, so you don’t need special privileges: use conda!

conda config --add channels conda-forge
conda update --all
conda install cdo
cdo

https://code.mpimet.mpg.de/projects/cdo/wiki/Anaconda

This cdo seems to work happily with both GRIB and NetCDF data, and the install also pulls in lots of other useful tools, in particular:

grib_to_netcdf

To convert IFS from GRIB to NetCDF, I think the best command to use may be:

grib_to_netcdf -T /pithos/shared/ESIPP/ECMWF/IFS/IFS_IT00_20180101_20180107.grib -o IFS.cdf

The “-T” option is good here: it tells grib_to_netcdf not to use the time of validity, so the output keeps separate initialisation and forecast-step dimensions:

longitude = 80 ;
latitude = 58 ;
step = 49 ;
date = 7 ;

where date is when the forecast started (initialisation time, IT), and step is the forecast lead time in hours (0 to 48).
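To make the date/step layout concrete: the valid time of each field is its initialisation date plus the forecast step in hours. A small sketch using the dimensions above (the specific dates are illustrative, taken from the filename in the command):

```python
from datetime import datetime, timedelta

# The 'date' dimension: 7 daily initialisation times (20180101-20180107)
dates = [datetime(2018, 1, 1) + timedelta(days=d) for d in range(7)]

# The 'step' dimension: hourly forecast steps from 0 to 48
steps = list(range(49))

# Valid time of any field = initialisation time + forecast step
valid = dates[2] + timedelta(hours=steps[6])
print(valid)  # 2018-01-03 06:00:00
```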

Fantastic!

WRF driven by GFS Ensemble data on ORR2

GFS Ensemble data are available to download from here:

  • https://www.ncdc.noaa.gov/data-access/model-data/model-datasets/global-ensemble-forecast-system-gefs
  • Grid GENS 3
  • Start Date = End Date. Select files. I’m assuming that _00 is the “control” forecast, and _01 to _20 are the perturbed ensemble members.
  • When your data are ready to download, you will get an email notification. Click on the “Web Download” link. Each ensemble forecast member will have a link that looks something like this:
    gens_3_2017101200_00.g2.tar 03-Apr-2018 07:06 1.6G
    Right-click on the file and copy the link address. Now ssh into the ORR2 computer. Create a folder for this data, like GRIB/GENS/GENS00, and use wget with your copied link to download the data file.
  • Unpack your forecast data file using the tar command, e.g.:
    tar -xf gens_3_2017101200_12.g2.tar
  • You should now have lots of .grb2 files in that directory. You’re ready to run WRF.
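Since each of the 21 members is its own tar file, it can save some clicking to generate the full list of filenames in a loop. A short sketch following the naming pattern above (the initialisation date 2017101200 is just the example from this post):

```python
# GEFS member files follow gens_3_<YYYYMMDDHH>_<NN>.g2.tar,
# where NN=00 is the control and NN=01 to NN=20 are the perturbed members
init = "2017101200"
members = [f"gens_3_{init}_{m:02d}.g2.tar" for m in range(21)]

for name in members:
    print(name)  # feed these to wget, one per member
```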

The steps to run WRF driven by GENS are similar to those to run WRF with GFS, so follow this post: https://metclim.ucd.ie/2018/02/wrfv3-9-1-test-case-on-orr2/. A few differences to note:

  • link Vtable to ungrib/Variable_Tables/Vtable.GFSENS
  • ./link_grib.csh to both the gens-a and gens-b files.
  • In namelist.input, use: num_metgrid_levels = 27,

If all goes well, you should be able to run your WRF forecast. WRF doesn’t output MSLP directly, but you can use code like this to calculate and plot it:
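For example, here is a minimal sketch of the standard sea-level reduction from surface pressure, terrain height, and 2 m temperature (in WRF output these would be the PSFC, HGT, and T2 fields; note that this simple hypsometric formula is a rough version of what dedicated tools such as wrf-python’s slp diagnostic compute more carefully):

```python
import numpy as np

G = 9.80665     # gravity (m s-2)
RD = 287.05     # gas constant for dry air (J kg-1 K-1)
GAMMA = 0.0065  # standard atmosphere lapse rate (K m-1)

def mslp(psfc, hgt, t2):
    """Reduce surface pressure psfc (Pa) at terrain height hgt (m) to
    mean sea level, estimating the mean column temperature from the
    2 m temperature t2 (K) and a standard lapse rate."""
    t_mean = t2 + GAMMA * hgt / 2.0
    return psfc * np.exp(G * hgt / (RD * t_mean))

# Illustrative values: 950 hPa at 500 m elevation, 285 K at 2 m
p = mslp(np.array([95000.0]), np.array([500.0]), np.array([285.0]))
print(p[0] / 100.0)  # sea-level pressure in hPa, roughly 1008
```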