Taylor Diagrams and high resolution plots of land use and elevation.

This is an overview of how to produce the following plots.

Taylor Diagram

The Taylor diagram graphically illustrates how well a model represents the variability of the observations it is attempting to predict, in terms of correlation, standard deviation and RMSE.
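The geometry behind the diagram is simple: each model becomes a point whose angle from the x-axis encodes its correlation with the observations and whose distance from the origin encodes its standard deviation. A minimal sketch of that mapping, using made-up values for the model statistics:

```python
import numpy as np

# Hypothetical model statistics: correlation with observations and
# standard deviation (often normalised by the observed one).
corr, std = 0.9, 0.8
std_obs = 1.0

# Polar -> Cartesian: the angle encodes correlation, the radius the
# standard deviation.
theta = np.arccos(corr)
x, y = std * np.cos(theta), std * np.sin(theta)

# The centred RMS difference is the distance from the model point to
# the observation point at (std_obs, 0), by the law of cosines.
crmsd = np.sqrt(std**2 + std_obs**2 - 2 * std * std_obs * corr)
```

This is why, in the functions below, the RMSE contours are simply circles centred on the observation point.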

The code below was adapted to suit our needs from a Python file in this GitHub repository; credit goes to its authors.

This involves writing two Python functions: one for generating circles on a graph, and one that pulls the data together, draws the appropriate circles and produces the plot. These are shown in the code blocks below.

import numpy as np
import matplotlib.pyplot as plt

def drawCircle(radius, xcenter=0, ycenter=0, maxradius=np.inf,
               style="--", color="k", lw=1, label=""):
    angles = np.linspace(-np.pi / 2, np.pi / 2, 361)
    x = np.sin(angles) * radius + xcenter
    y = np.cos(angles) * radius + ycenter

    # Only keep points within the outer circle
    I = np.where(x ** 2 + y ** 2 < maxradius ** 2)[0]
    if len(I) == 0:
        return
    x = x[I]
    y = y[I]
    # Draw the upper semicircle and mirror it for the lower half
    plt.plot(x, y, style, color=color, lw=lw, zorder=-100, label=label)
    plt.plot(x, -y, style, color=color, lw=lw, zorder=-100)
def makeTaylor(corrs, stds, stdobs, circ_LIM=1.0, loc='location'):
    '''This function takes a list of correlations and standard deviations
       corresponding to 6 different model outputs in this case (you should
       alter the code for your application), along with the standard
       deviation of the observations. circ_LIM determines the radius of
       the outer circle in the plot and loc is a name identifier (like a
       station name) used in plot titles and filenames for saving plots.'''
    corrs = np.asarray(corrs)
    stds = np.asarray(stds)

    plot_colors = ['blue', 'purple', 'brown', 'orange', 'green', 'red']
    ang = np.arccos(corrs)
    x = stds * np.cos(ang)
    y = stds * np.sin(ang)
    fig, ax = plt.subplots(1, 1, figsize=(8, 8))

    # First three models as circles, last three as crosses
    plt.scatter(x[:3], y[:3], marker='o', s=120, color=plot_colors[:3], edgecolor='k')
    plt.scatter(x[3:], y[3:], marker='X', s=180, color=plot_colors[3:], edgecolor='k')

    xticks = np.arange(0, circ_LIM, 0.1)

    # Dashed cyan arc marking the observed standard deviation
    drawCircle(stdobs, xcenter=0, ycenter=0, style="--", color="cyan", lw=3)

    plt.xticks(xticks[xticks >= 0])
    plt.text(np.sin(np.pi / 4) * circ_LIM, np.cos(np.pi / 4) * circ_LIM,
             "Correlation", rotation=-45, horizontalalignment="center",
             verticalalignment="bottom")

    plt.plot(stdobs, 0, 's-', color='cyan', label="Observation", mew=2, clip_on=False)

    # Radial correlation guide lines; a separate name is used so the
    # input correlations are not overwritten
    corr_lines = [0, 0.5, 0.8, 0.9, 0.95, 0.99]  # adjust to taste
    for c in corr_lines:
        angl = np.arccos(c)
        xl = np.cos(angl) * circ_LIM
        yl = np.sin(angl) * circ_LIM
        plt.plot([0, xl], [0, yl], 'k--')
        plt.text(xl, yl, str(c), verticalalignment="bottom")

    # Gray circles of constant centred RMSE around the observation point
    Rs = np.linspace(0.1, 1.0, 10)
    for R in Rs:
        drawCircle(R, xcenter=stdobs, ycenter=0, maxradius=circ_LIM,
                   style="-", color="gray", lw=3)
        xc = np.sin(-np.pi / 4) * R + stdobs
        yc = np.cos(np.pi / 4) * R
        if xc > 0:
            plt.text(xc, yc, str(R)[:3], horizontalalignment="right",
                     verticalalignment="top", color="gray")

    # Dotted circles of constant standard deviation, plus the solid outer rim
    for X in plt.xticks()[0]:
        if X <= circ_LIM:
            drawCircle(X, style=":")
    drawCircle(circ_LIM, style="-", lw=3)

    plt.savefig('plots/' + loc.replace(' ', '_') + '_TaylorWind', dpi=300)

This will produce a plot like this:

Corine Land use data

The Corine land use dataset can be downloaded from the EPA website. This data comes in the form of a .shp file, which I had not encountered before. A quick Google search suggested that the geopandas library would allow me to analyse the data. The data comprise 18882 rows × 8 columns. The important column is “Class_Desc”, which gives the names of the land use classifiers.

import geopandas as gpd
import matplotlib.pyplot as plt

cr = gpd.read_file('CLC18_IE_ITM/CLC18_IE_ITM.shp')
cr.plot(column='Class_Desc', figsize=(10, 10))
plt.show()


This produced a map of the ROI shown below.

The colouring represents each of the 35 land use categories in the dataset.
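Because geopandas exposes the attribute table as a pandas DataFrame, the category breakdown can be inspected directly. A toy illustration of that kind of inspection (a made-up miniature table standing in for the real 18882-row one):

```python
import pandas as pd

# Toy stand-in for the Corine attribute table; the real table has a
# "Class_Desc" column holding the land use classifiers.
df = pd.DataFrame({"Class_Desc": ["Pastures", "Peat bogs", "Pastures",
                                  "Coniferous forest"]})

n_categories = df["Class_Desc"].nunique()   # number of distinct classes
counts = df["Class_Desc"].value_counts()    # polygons per class
```

Running the same two lines on the real `cr` GeoDataFrame gives the 35 categories and how many polygons fall in each.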

Looking into the data further, it becomes apparent that the data are mapped in the Irish Transverse Mercator projection on an easting and northing grid. It is possible to zoom in on certain areas by indexing the data appropriately, and lat-lon coordinates can be converted to easting and northing using this handy tool from OSI. An example below zooms in on Kerry and drops pins (black crosses) at two locations, Lerrig and Dromnacurra, in the north of the county.
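If you would rather script the conversion than use the OSI web tool, pyproj can perform the same transformation (Irish Transverse Mercator is EPSG:2157; this assumes pyproj is installed, and is an alternative rather than what the workflow below uses):

```python
from pyproj import Transformer

# WGS84 lon/lat (EPSG:4326) -> Irish Transverse Mercator (EPSG:2157)
to_itm = Transformer.from_crs("EPSG:4326", "EPSG:2157", always_xy=True)

# Sanity check: the ITM projection origin (8W, 53.5N) maps to the
# false easting/northing of 600000, 750000.
easting, northing = to_itm.transform(-8.0, 53.5)
```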

import geopandas as gpd
import matplotlib.pyplot as plt

cr = gpd.read_file('CLC18_IE_ITM/CLC18_IE_ITM.shp')

fig, ax = plt.subplots(1, 1, figsize=(10, 10))
cr.plot(ax=ax, column='Class_Desc')

# Markers, with easting/northing values converted using the OSI tool
# (substitute your own converted coordinates)
ax.scatter([...], [...], marker='x', s=100, color='k')

# Arbitrary plot limits (ITM easting/northing) to zoom in on Kerry
ax.set_xlim(...)
ax.set_ylim(...)

Digital Elevation Map (DEM)

One option for downloading high resolution elevation data is outlined in this handy tutorial. You will need to install the following libraries.

$ conda create -n gdal_test
$ conda activate gdal_test
$ conda install gdal
$ conda install elevation

These allow you to both download and plot DEM data. The following should be entered at the command line to download your data (obviously change the bounding box to suit your application). I have chosen to focus on Kerry (again).

$ eio clip -o Kerry-30m-DEM.tif --bounds -10.6 51. -8.9 53. 

This will download the .tif that you have named in your command above. Now you can use gdal within python to visualise the data.

from osgeo import gdal
import numpy as np
import matplotlib 
import matplotlib.pyplot as plt

filename = "Kerry-30m-DEM.tif"
gdal_data = gdal.Open(filename)
gdal_band = gdal_data.GetRasterBand(1)
nodataval = gdal_band.GetNoDataValue()

# convert to a numpy array (np.float is removed in recent NumPy; use float)
data_array = gdal_data.ReadAsArray().astype(float)

# replace missing values if necessary
if np.any(data_array == nodataval):
    data_array[data_array == nodataval] = np.nan

#Plot out data with Matplotlib's 'contour'
fig = plt.figure(figsize = (12, 8))
ax = fig.add_subplot(111)

## The data are flipped vertically with [::-1] because they
## initially plotted upside down. This fixed it.
plt.contourf(data_array[::-1, :], cmap="viridis",
             levels=list(range(0, 1050, 50)))

plt.title("Elevation Contours over Kerry")
cbar = plt.colorbar()

Kerry does not have the same terrain complexity as Mt Shasta in the tutorial, so remember to adjust your contouring levels accordingly.

These data are mapped on a “World Geodetic System 1984” projection.
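To label the contour plot's axes in lon/lat rather than pixel indices, the mapping comes from the GeoTIFF's geotransform (available via gdal_data.GetGeoTransform()). A sketch of the arithmetic, using hypothetical geotransform values matching the Kerry bounding box at 1 arc-second (~30 m) resolution:

```python
# A GDAL geotransform gt maps pixel (row, col) to map coordinates:
#   X = gt[0] + col * gt[1] + row * gt[2]
#   Y = gt[3] + col * gt[4] + row * gt[5]
# Hypothetical values for a 1 arc-second grid over the Kerry box:
gt = (-10.6, 1 / 3600, 0.0, 53.0, 0.0, -1 / 3600)
ncols, nrows = 6120, 7200

lon_min, lat_max = gt[0], gt[3]
lon_max = gt[0] + ncols * gt[1]
lat_min = gt[3] + nrows * gt[5]
# These recover the --bounds passed to eio: -10.6 51. -8.9 53.
```

Passing `extent=(lon_min, lon_max, lat_min, lat_max)` to an imshow call (or building coordinate arrays for contourf) then puts real coordinates on the axes.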


Using conda install on a server

conda is great: it makes installing lots of packages easy. However, if you have an account on a server which already has conda installed, you should use conda to install packages only for your own user, not system-wide. To do this, you should make use of conda environments.

For example, let’s say you want to use cdo on your server. First, create a conda environment; I’ve called it cdo-env:

conda create -y -n cdo-env

Now activate your environment, so you can work within it:

conda activate cdo-env

When you’re in the environment, the shell prompt may include the name of the environment, as a handy reminder:

(cdo-env) [sweeneyc@login ~]$

Now you can install cdo within your cdo-env environment:

conda install -c conda-forge cdo

If all goes well, this should install cdo within your cdo-env environment, and you can use it whenever you like!

To close/exit your conda environment, use the command:

conda deactivate

That’s it!


Downloading ERA5 data

Install Climate Data Store API on linux:

I’ve installed mine on the mccdata server in UCD.
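For reference, installing the client typically amounts to a pip install plus a credentials file (the exact details, and your personal key, come from the CDS API page; the key below is a placeholder):

```shell
# Install the CDS API client for your own user (no root needed)
pip install --user cdsapi

# The client reads credentials from ~/.cdsapirc, which looks like:
#   url: https://cds.climate.copernicus.eu/api/v2
#   key: <UID>:<API-key>
```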

The great thing about using a script, instead of downloading directly from the CDS, is that it gives you more control. The ERA5 CDS page for ERA5 monthly averaged data on single levels from 1979 to present doesn’t let you change the grid or domain. I wanted data at a lower resolution, and only for 20-90N. I was able to specify this in my retrieval script:

import cdsapi

c = cdsapi.Client()

c.retrieve(
    'reanalysis-era5-single-levels-monthly-means',
    {
        ...
        'area': [90, -180, 20, 180],  # North, West, South, East. Default: global
        'grid': [1.0, 1.0],  # east-west (longitude) and north-south (latitude) resolution. Default: 0.25 x 0.25
        ...
    },
    'download.nc')  # output filename

Then I ran this from the command line:

python ERA5.mslp.CDS

And this quickly retrieved the required data!


June 2019 conferences: WESC and ICEM

I went to WESC (Wind Energy Science Conference) in UCC, where I presented a talk “A multivariate spatial post-processing method for day-ahead wind energy forecasts” in the forecasting & resource assessment session which was part of theme 1: wind resource, turbulence and wakes.

I also attended some interesting talks on wind energy resources and planning. I learnt how windfarms interact with each other, especially if they are less than 40 km apart. Also, that crop location/type can change the temperature, and consequently the wind profile, of the area. There is a temperature increase near wind farms, especially at night, due to warming near the turbines, which can lead to cooling further downwind.

There was also an interesting discussion on “Does the price of a windfarm location depend on the proximity to adjacent windfarms (whether planned or existing)?”. There was no firm conclusion (and no laws or rules govern this), but there may be occasions when some contracts receive compensation at certain times.

I also attended ICEM (International Conference Energy & Meteorology) in DTU in Lyngby, near Copenhagen in Denmark.

I presented a short talk “Forecasting Matters” at the inaugural WattMeet, a networking and presentation reception event the evening before the conference began.

I also presented a poster “Adaptive spatial post-processing to reduce systematic error in day-ahead renewable energy forecasts” as part of the “wind energy, combined energy and others” poster session.

I also heard some interesting talks, including “Improved very short term spatio-temporal wind forecasting using atmospheric regimes” by Jethro Browell. The talk was about finding a relationship with spatio-temporal structures on shorter time scales. He used a vector autoregressive (VAR) method which adjusts depending on the weather regime present.