Journal Club: 18-04-2017

Article Title: Evaluation of the reanalysis surface incident shortwave radiation products from NCEP, ECMWF, GSFC, and JMA using satellite and surface observations. Remote Sensing 8.3 (2016)

This paper evaluates the solar radiation incident at the Earth’s surface (Rs) in six global reanalyses (NCEP-NCAR, NCEP-DOE, CFSR, ERA-Interim, MERRA, and JRA-55) using surface measurements from five observation networks (GEBA, BSRN, GC-NET, Buoy, and CMA) and the Clouds and the Earth’s Radiant Energy System (CERES) EBAF product from 2001 to 2009. For surface stations that report only instantaneous values of Rs, the daily integrated Rs was obtained from the instantaneous values through a sinusoidal interpolation method.

All of the selected reanalysis Rs products overestimated monthly Rs when compared to the surface measurements from the five networks. Most reanalysis products showed better accuracy in DJF than in JJA. The reanalyses were also compared to CERES-EBAF: almost all of them overestimated monthly Rs over land, the oceans, and the globe. Bias and RMSE values of the global reanalyses compared with CERES-EBAF Rs data were smaller than those relative to the surface observations.

We could look at using GEBA and CERES for evaluation of MERRA2 and ERA-Interim and compare the results here. These might be sources of solar radiation data for the UK (if we expand our research area).


Satellite Data for Solar Energy Applications

As the use of solar energy is constantly growing in Ireland, it has become increasingly important to assess the accuracy and skill of our weather forecasts and models. Satellites can be used on short time-scales to verify incoming solar radiation forecasts and also potentially to update operational PV models. This report outlines appropriate satellite data and the relevant retrieval methods for examining solar energy processes in Ireland.

There are two main categories of satellites: geostationary and polar orbiting satellites. Polar orbiting satellites orbit the Earth longitudinally, passing directly over the poles at an altitude of about 850 km and taking approximately 100 minutes to complete an orbit. During one half of the orbit, the satellite views the daytime side of the Earth; at the pole, it crosses over to the night-time side. Because the Earth rotates underneath the orbiting satellite, each orbit views an area west of the previous one. Polar orbiting satellites will therefore view most of the Earth at least twice in a 24-hour period.

Geostationary satellites orbit the earth over the equator at an altitude of about 36,000km. These satellites have a lower resolution than polar orbiting satellites due to their higher altitude. However, they record a continuous full disk image of the same region in every image, which is advantageous for observing evolving weather features. A downside of using geostationary satellite images for Ireland is our high latitude, which causes a perspective distortion in the images because the satellite is located over the equator. Nevertheless, this report found geostationary satellite data to be the most suitable for solar energy processes in Ireland.

This report looks at the different formats of satellite data; raw satellite data, derived satellite products and satellite images. The most appropriate source and format found was geostationary Meteosat images from the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) Data Centre. The images have a temporal resolution of 15 minutes and the data can be easily downloaded from the browser or retrieved via wget.
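As a small sketch of working with the 15-minute repeat cycle, the image time slots within a given hour can be generated in a loop, e.g. when building the file names for a download script. The date, hour, and the plain YYYYMMDDHHMM naming below are arbitrary placeholders, not EUMETSAT's actual file-name convention:

```shell
# List the 15-minute image slots within one hour of a given day.
# DAY and HOUR are arbitrary examples; adapt the echo line to whatever
# file-name convention your EUMETSAT Data Centre order uses.
DAY=20170418
HOUR=12
for MIN in 00 15 30 45; do
  echo "${DAY}${HOUR}${MIN}"
done
```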



Dealing with GRIB2 data

We use python for our data analysis and plotting.

The wonderful xarray package has no problem with NetCDF, but doesn’t like GRIB2. GFS data are in GRIB2, however, so it would be nice if we could convert them easily to NetCDF.

We can do this by using the great wgrib2 tool.

To install wgrib2 (on ORR2 or SONIC, for example), do the following:

  1. Make a directory for wgrib2 and move there:
    mkdir wgrib2
    cd wgrib2
  2. Get wgrib2 and unpack it:
    tar -xzf wgrib2.tgz
  3. Move into the directory, set environment variables and make wgrib2:
    cd grib2/
    export CC=gcc
    export FC=gfortran
    make
  4. Now you can convert a GFS GRIB2 file to NetCDF!
    wgrib2/grib2/wgrib2/wgrib2 gec00.t12z.pgrb2f00 -netcdf gec00.t12z.pgrb2f00.nc

    It’d be a good idea to add the wgrib2 executable directory to your PATH, so you can call it from any directory.
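For example, assuming the build directory from step 3 sits under your home directory (adjust the path to wherever you unpacked the tarball), you could add it to your PATH and then convert a whole directory of GRIB2 files in one loop:

```shell
# Add the wgrib2 build directory to PATH for this session
# (the path is an assumption -- adjust to your unpack location):
export PATH="$PATH:$HOME/wgrib2/grib2/wgrib2"

# Convert every .grb file in the current directory to NetCDF:
for f in *.grb; do
  [ -e "$f" ] || continue                 # skip cleanly if no .grb files exist
  wgrib2 "$f" -netcdf "${f%.grb}.nc"      # same base name, .nc extension
done
```

To make the PATH change permanent, add the export line to your ~/.bashrc.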


Download GFS forecast

We would like to run WRF forecasts driven by global forecast data from NOAA’s Global Forecast System (GFS).

To download a subset of the GFS data use NOMADS g2subs:

  • Select the “grib filter” link for the dataset you want.
  • Use “make subregion”: e.g. lon:-55 to 25, lat: 25 to 70
  • Don’t use the analysis data file – it doesn’t contain soil data.
  • You can see which fields are in the GFS file by using the g2print.exe command in the WPS/util directory.
  • Using all levels, all variables, and the subregion (lon: -55 to 25, lat: 25 to 70) reduced the size of the analysis file from 176M to 24M. The file size becomes even smaller by requesting fewer variables.
  • If you want to download a lot of data, you should tick the “Show the URL only for web programming” box, and write a bash script to change the URL for the dates you want, with variable output file name: curl $URL -o ${EM}${FT}.grb

How do you write that bash script?

Say you end up with a URL like this:


This URL consists of several parts:

  • file=gfs.t12z.pgrb2.0p25.f000 – this is the base file name: model GFS, forecast initial time 12Z, …, forecast hour is 000. The forecast hour is probably what we want to change.
  • dir=%2Fgfs.2017041712 – this is the year, month, day and hour of the forecast initial time. All forecast data are inside this directory.
  • We don’t want to change the rest of the URL: forecast model, level, variables, subregion. If you want to change these, go through the grib_filter web page again and generate a new URL.

If we want to download multiple forecast hours for the same forecast initial time, we could do so with a bash script like this:


# set the base URL in two parts, URL1 and URL2,
# leaving out the forecast hour:
URL1="..."    # the part of the URL before the forecast hour
URL2="..."    # the part of the URL after the forecast hour


# Let the forecast hour vary from 0 to 24.
# It needs to have three digits, so we start with 1000:

for i in {0..24}
do
  echo $i
  TFCH=`expr 1000 + $i`
  FCH=`echo $TFCH | cut -c2-4`
  URL=${URL1}${FCH}${URL2}
  curl $URL -o GFS${FCH}.grb
done

Note: make sure you are using backticks (`) not single quotes (‘) on the TFCH= and FCH= lines. Single quotes don’t evaluate anything in between them. Backticks run the command between the backticks and return the result, which is what we want here.
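As an aside, the same three-digit zero-padding can be done in a single step with printf, avoiding the expr/cut pair. This is a style choice, not a requirement:

```shell
# printf "%03d" left-pads a number with zeros to three digits:
for i in 0 7 24; do
  FCH=$(printf "%03d" "$i")
  echo "$FCH"
done
```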

Instead of typing these commands into the terminal, you can simply save all of the above into a file, say “getGFS.bash” and then in the terminal type:

bash getGFS.bash
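If you also want to loop over forecast initial dates (the dir=%2Fgfs.YYYYMMDDHH part of the URL), GNU date can do the calendar arithmetic. A sketch, assuming GNU date is available and using an arbitrary example start date:

```shell
# Print the dir= date strings for three consecutive 12Z initial times.
# START is an arbitrary example date; -d with a relative offset is a
# GNU date feature (not available in BSD/macOS date).
START=20170417
for d in 0 1 2; do
  DAY=$(date -d "$START + $d day" +%Y%m%d)
  echo "dir=%2Fgfs.${DAY}12"
done
```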