Computational Science Community Wiki

Running WPS and creating model domains

This page is for information on setting up model domains, and running WPS to create the meteorology files for those domains.

Setting up the model domain

The WRF Domain Wizard website is, unfortunately, no longer active. You can, however, use an NCL script to test the settings you wish to use for WPS and to check that your domain is set up correctly.
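For example, WPS ships a domain-plotting script in its util/ directory (plotgrids.ncl in older releases, plotgrids_new.ncl in newer ones) which reads namelist.wps and draws the domain outline(s). A minimal sketch, run from the WPS root directory with NCL available:

ncl util/plotgrids.ncl

If the resulting plot shows the domain (and any nests) where you expect them, the &geogrid settings are ready for geogrid.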

Running WPS on ARCHER

Running geogrid

Running WPS in parallel must be done on the compute nodes, which cannot access the /nerc/ space, so:

1. Move the WPS executables and input files to a new WPS directory on the /work/ space

2. Create a softlink to the geographical files in Doug's space, or copy the directory across to the WPS root directory:

ln -s /work/n02/n02/lowe/WPS/WPS_geog_V3.4/geog geog_dir

or

cp -r /work/n02/n02/lowe/WPS/WPS_geog_V3.4/geog geog_dir

3. In the &geogrid section of namelist.wps add the following line to set the geog_data_path to geog_dir:

geog_data_path = 'geog_dir',

or if this doesn't work, point to Doug's data or where you've copied it to:

geog_data_path = '/work/n02/n02/lowe/WPS/WPS_geog_v3.4/geog/',

or

geog_data_path = '/work/n02/n02/[user]/WPS/WPS_geog_v3.4/geog/',

If the geogrid directory has not been copied to the working directory from where WPS was compiled, then add the following line to point to the geogrid tables:

opt_geogrid_tbl_path = '/work/n02/n02/lowe/WPS/tables/geogrid/',

otherwise it defaults to

opt_geogrid_tbl_path = '/work/n02/n02/user/WPS/geogrid/',

(If this is the case, similarly add the following line to the &metgrid section: opt_metgrid_tbl_path = '/work/n02/n02/lowe/WPS/tables/metgrid/',)

4. Don't forget to edit run_geogrid_par.sh, replacing #PBS -A n02-weat with #PBS -A n02-chem if you are on the NCAS Composition allocation rather than the Weather one

5. Submit the job to the long queue or the short queue (runtime < 20 mins, 9:00-17:00 Mon-Fri):

qsub -q short run_geogrid_par.sh

or

qsub -q long run_geogrid_par.sh

6. Check the job has been submitted using: qstat -u username

7. Check that the final line of geogrid.log is

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

!  Successful completion of geogrid.        !

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
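A quick command-line check (run in the directory containing geogrid.log):

tail -n 3 geogrid.log

should print the banner above, and the geo_em.d0*.nc files should now exist in the geogrid output directory.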

Running ungrib for grib1 format ECMWF data (needed for periods after January 2015)

A. First generate the Surface data:

1. Create a Vtable softlink to the ECMWF surface Vtable:

ln -s alt_Vtables/Vtable.ECSFC Vtable

2. Run link_grib.csh to tell ungrib where the surface files are. link_grib.csh links the downloaded grib files to names of the form GRIBFILE.AAA. For example, if the files are prefixed sfc in the directory /met_data/20150710/sfc/, then:

./link_grib.csh met_data/20150710/sfc/sfc* (or ./link_grib.csh met_data/20150710/sfc/)

3. I've had difficulty running ungrib as a serial job, so I have created a script for running it in parallel, based on the template for geogrid:

#
#PBS -l select=1
#PBS -l walltime=00:20:00
#PBS -A n02-chem
#PBS -N ungrib

cd $PBS_O_WORKDIR

# UNGRIB
aprun -n 4 -N 4 ./ungrib.exe 2>&1 | tee ungrib.log

4. Run ungrib as a parallel job:

qsub -q short run_ungrib_par.sh

5. Check that the final line of ungrib.log (or screen output) is

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

!  Successful completion of ungrib.         !

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

This should create the surface files with the prefix specified in the &ungrib section of namelist.wps, e.g.:

&ungrib
 out_format = 'WPS',
 prefix = 'SFC',
/
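To confirm the intermediate files were written, list them; ungrib names its output <prefix>:<date>, so for this example (the dates will match the period in your namelist):

ls -1 SFC:*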

6. Remove the softlinks to the surface GRIBFILES and Vtable

rm GRIBFILE.AA* 
rm Vtable 

B. Next generate the Atmospheric 3D data:

1. Rerun link_grib.csh to tell ungrib where the atmospheric 3D files are. link_grib.csh links the downloaded grib files to names of the form GRIBFILE.AAA. For example, if the files are prefixed pl in the directory /met_data/20150710/pl/, then:

./link_grib.csh met_data/20150710/pl/

2. Create a Vtable softlink to the ECMWF atmospheric variable Vtable:

ln -s alt_Vtables/Vtable.ECATM Vtable

3. Edit the prefix in the &ungrib section of namelist.wps, e.g.:

&ungrib
 out_format = 'WPS',
 prefix = '3D',
/

4. Rerun ungrib as a parallel job:

qsub -q short run_ungrib_par.sh

5. Check that the final line of ungrib.log (or screen output) is

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

!  Successful completion of ungrib.         !

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

This should create the atmospheric 3D files with the prefix specified in the &ungrib section of namelist.wps.

Running ungrib for grib2 format GFS or FNL (needed for periods after January 2015)

All the following is subject to checking and should not be followed, or at least relied upon - yet!

(refer to the following for expanded discussion and more details - http://rda.ucar.edu/datasets/ds083.2/docs/WRF_NCEP2)

1. Copy the new Vtable (from http://www2.mmm.ucar.edu/wrf/src/Vtable.GFS_new) to the ungrib directory in WPS (if running before Jan 2015 use the old Vtable.GFS)

scp Vtable.GFS_new user@login.archer.ac.uk:/work/n02/n02/user/WPS/ungrib/Variable_Tables

2. Create a soft link to the new Vtable

ln -s ungrib/Variable_Tables/Vtable.GFS_new Vtable

3. Download the grib2 GFS or FNL files to the met directory (suggested here: the NCEP GDAS final analysis from http://rda.ucar.edu/datasets/ds083.3/, available from Jan 2015 to date at 0.25x0.25 degrees, or http://rda.ucar.edu/datasets/ds083.2/, from 1999 to date at 1x1 degree). Ensure that the start and end dates/times cover the period in the namelist at the required interval, e.g. from midnight on the 10th to 6 pm on the 12th of August 2016, every 6 hours, if

start_date = '2016-08-10_00:00:00',
end_date   = '2016-08-12_18:00:00',
interval_seconds = 21600,
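As an optional sanity check, the following shell sketch (assuming GNU date, as on the ARCHER login nodes) lists every analysis time the above settings require, so you can compare against the GRIB files you have downloaded:

start="2016-08-10 00:00"; end="2016-08-12 18:00"; step=21600   # matches the namelist above
t=$(date -u -d "$start" +%s); tend=$(date -u -d "$end" +%s)
while [ "$t" -le "$tend" ]; do
  date -u -d "@$t" "+%Y-%m-%d_%H"   # one line per required analysis time
  t=$((t+step))
done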

4. Link the downloaded grib files to names in the format GRIBFILE.AAA using link_grib.csh

e.g. if the files are prefixed gfs in directory /data/gfs, then ./link_grib.csh /data/gfs/gfs*

5. Run ungrib:

qsub -q short run_ungrib.sh

(or edit the run_ungrib.sh script, replacing ./ungrib.exe with ./ungrib.exe >& ungrib.log so that the output goes to a log file instead of the screen; using ./ungrib.exe 2>&1 | tee ungrib.log instead writes the log file while still echoing the output to the screen).

6. Check that the final line of ungrib.log (or screen output) is

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

!  Successful completion of ungrib.         !

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

Running metgrid

metgrid creates the netcdf met input files from the ungribbed surface and atmospheric 3D data.

1. Create a directory for the metgrid netcdf output files in the WPS root directory

mkdir metgrid_output

Edit the &metgrid section of the namelist.wps file to provide input and output details, noting that the fg_name entries are the prefixes set in the &ungrib section:

 fg_name = '3D', 'SFC'
 io_form_metgrid = 2,
 opt_output_from_metgrid_path = 'metgrid_output/',

2. If the metgrid directory has not been copied to the working directory from where WPS was compiled then add the following line to set the link to the metgrid tables:

opt_metgrid_tbl_path = '/work/n02/n02/lowe/WPS/tables/metgrid/',

3. Set the model levels by adding the following section:

&mod_levs
 press_pa = 201300 , 200100 , 100000 ,
             95000 ,  90000 ,
             85000 ,  80000 ,
             75000 ,  70000 ,
             65000 ,  60000 ,
             55000 ,  50000 ,
             45000 ,  40000 ,
             35000 ,  30000 ,
             25000 ,  20000 ,
             15000 ,  10000 ,
              5000 ,   1000
/

4. Check that the run_metgrid_par.sh script reads similarly to the following, with static allocation of cores:

#
#PBS -l select=1
#PBS -l walltime=00:20:00
#PBS -A n02-chem
#PBS -N metgrid

### script to run metgrid.exe

cd $PBS_O_WORKDIR

# METGRID
aprun -n 4 -N 4 ./metgrid.exe 2>&1 | tee metgrid.log

5. Run metgrid as a parallel job:

qsub -q short run_metgrid_par.sh

6. Check that the final line of metgrid.log (or screen output) is

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!  Successful completion of metgrid.        !
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

This should create the netCDF met input files for WRF-Chem in the metgrid_output directory.
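A quick check that nothing is missing (one met_em file is expected per analysis time and per domain):

ls -1 metgrid_output/met_em.d01.*.nc | wc -l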

Old Guide on setting up and running WPS

Example namelist.wps

The WRF Preprocessing programs (WPS) are controlled with the namelist.wps file, e.g.:

&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2006-06-10_00:00:00', 
 end_date   = '2006-07-31_00:00:00', 
 interval_seconds = 21600,
 io_form_geogrid = 2,
 opt_output_from_geogrid_path = 'geo_output/',
 debug_level = 1000,
/

&geogrid
 parent_id         = 1,
 parent_grid_ratio = 1,
 i_parent_start    = 1,
 j_parent_start    = 1,
 e_we          = 150,
 e_sn          = 180,
 geog_data_res = '30s',
 dx = 10000,
 dy = 10000,
 map_proj =  'lambert',
 ref_lat   = 53.293,
 ref_lon   = -3.262,
 truelat1  = 53.293,
 truelat2  = 53.293,
 stand_lon = -3.262,
 geog_data_path = '/work/n02/n02/sru20/DATA/v3.2/geog',
 opt_geogrid_tbl_path = 'geogrid/',
 ref_x = 75.0,
 ref_y = 90.0,
/

&ungrib
 out_format = 'WPS',
 prefix = 'FILE',
/

&metgrid
 fg_name = 'FILE',
 io_form_metgrid = 2,
 opt_output_from_metgrid_path = 'met_output/',
 opt_metgrid_tbl_path = 'metgrid/',
/

&mod_levs
 press_pa = 201300 , 200100 , 100000 ,
             95000 ,  90000 ,
             85000 ,  80000 ,
             75000 ,  70000 ,
             65000 ,  60000 ,
             55000 ,  50000 ,
             45000 ,  40000 ,
             35000 ,  30000 ,
             25000 ,  20000 ,
             15000 ,  10000 ,
              5000 ,   1000
 /

&domain_wizard
 grib_data_path = 'null',
 grib_vtable = 'null',
 dwiz_name    =defra_10km
 dwiz_desc    =10km grid for DEFRA
 dwiz_user_rect_x1 =3765
 dwiz_user_rect_y1 =620
 dwiz_user_rect_x2 =4189
 dwiz_user_rect_y2 =1032
 dwiz_show_political =true
 dwiz_center_over_gmt =true
 dwiz_latlon_space_in_deg =10
 dwiz_latlon_linecolor =-8355712
 dwiz_map_scale_pct =50.0
 dwiz_map_vert_scrollbar_pos =323
 dwiz_map_horiz_scrollbar_pos =3479
 dwiz_gridpt_dist_km =10.0
 dwiz_mpi_command =null
 dwiz_tcvitals =null
 dwiz_bigmap =Y
/

The key settings which the WRF domain wizard is unlikely to get right (and so you'll have to modify before running WPS) are:

Running the WPS

Prior to running WRF-Chem, the WPS programs must be run to create the input files for real.exe. The three programs are:

  1. geogrid - Interpolates terrestrial data to chosen domain.

  2. ungrib - "ungribs" GRIB files to an intermediate format.

  3. metgrid - Horizontally interpolates ungribbed met data to domain.

To prepare to run these you must have:

  1. the settings files created by the WRF domain wizard:
    • namelist.input
    • namelist.wps
    • nest7grid.parms
  2. the three executable files
  3. the geogrid directory

  4. the metgrid directory

No other files from the WPS compilation directory are needed for actually running the programs.

To run geogrid:

  1. the geography data must be linked in the namelist.wps file:
    •  geog_data_path = '/work/n02/n02/sru20/DATA/v3.2/geog' 

  2. you must use the MPI batch system, as for wrf.exe - 2 cores are sufficient though (a minimal batch script sketch follows this list)
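A minimal batch script sketch along those lines, modelled on the ungrib and metgrid scripts earlier on this page (the account code and walltime here are assumptions - adjust them for your allocation):

#PBS -l select=1
#PBS -l walltime=00:20:00
#PBS -A n02-chem
#PBS -N geogrid

cd $PBS_O_WORKDIR

# GEOGRID - 2 MPI ranks are enough for this job
aprun -n 2 -N 2 ./geogrid.exe 2>&1 | tee geogrid.log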

Running with MODIS land-use data

The V3.4.1 WRF-Chem standard release is not compatible with MODIS land-use data. To use MODIS land use, changes need to be made to the code. A compiled version of WRF-Chem that works with MODIS land use is located on HECToR at /home/n02/n02/scottan/WRF-CHEM/v3.4.1_modis. Changes have been made to the following files, with the changes highlighted with !++sw and !--sw:

 chem/chemics_init.F
 share/input_wrf.F
 chem/module_dep_simple.F
 chem/module_ftuv_driver.F

In namelist.wps, set geog_data_path to the v3.4 geographical data and select the MODIS land-use resolutions:

 geog_data_path = '/work/n02/n02/lowe/WPS_geog_v3.4/geog/',
 geog_data_res = 'modis_30s+10m','modis_30s+2m',

To run ungrib:

  1. the meteorology GRIB files must be linked to in the working directory, using the link_grib.csh script:

    •  ./link_grib.csh <FILEPATH>/* 

    • where <FILEPATH> is the path to the GRIB files (e.g. /work/n02/n02/sru20/gribfiles/ecmwf/)

  2. create a link called Vtable to the file <WPS-ROOT>/ungrib/Variable_Tables/Vtable.ECMWF

  3. run ungrib as a serial job

These steps should be conducted for each set of meteorology GRIB files that you are using. If your data are spread across more than one set of GRIB files, repeat the steps for each set, changing the file prefix in the namelist.wps file, e.g.:

&ungrib
 out_format = 'WPS',
 prefix = 'SFCFILE',
/

Delete the links to the first set of GRIB files before linking to, and processing, the second set. Also make sure you use the correct Vtable for each set of GRIB files: link the file Vtable in the working directory to the appropriate table, e.g. for processing surface files:  ln -s alt_Vtables/Vtable.ECSFC Vtable 
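A sketch of that cycle for ECMWF surface and 3D pressure-level files (the paths and file prefixes here are examples only):

# surface pass
ln -sf alt_Vtables/Vtable.ECSFC Vtable
./link_grib.csh /work/n02/n02/sru20/gribfiles/ecmwf/sfc*
# set prefix = 'SFCFILE' in the &ungrib section of namelist.wps, then run ungrib
rm GRIBFILE.* Vtable

# atmospheric 3D pass
ln -sf alt_Vtables/Vtable.ECATM Vtable
./link_grib.csh /work/n02/n02/sru20/gribfiles/ecmwf/pl*
# set prefix = '3DFILE' in namelist.wps, then run ungrib again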

To run metgrid:

  1. again you must use the MPI batch system - 2 cores are again sufficient to get the job done

If more than one set of meteorology input files are being used then add the prefixes to the metgrid section of namelist.wps:

&metgrid
fg_name = '3DFILE', 'SFCFILE'
 opt_output_from_metgrid_path = '/short/w22/sru563/WRF/WRFV_3.4/WPS/met_files_tara/'
 io_form_metgrid = 2,
/

After running all three programs, a series of met files will have been generated for each domain, of the form:

met_em.d01.2011-01-11_00:00:00.nc
met_em.d01.2011-01-11_06:00:00.nc
met_em.d01.2011-01-11_12:00:00.nc
met_em.d01.2011-01-11_18:00:00.nc
met_em.d01.2011-01-12_00:00:00.nc
met_em.d01.2011-01-12_06:00:00.nc
met_em.d01.2011-01-12_12:00:00.nc
met_em.d01.2011-01-12_18:00:00.nc

These met files need to be linked into the WRF-Chem working directory before real can be run.
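A sketch of that step (the directories here are placeholders for your own WPS and WRF-Chem run directories):

cd /path/to/WRFChem/run
ln -sf /path/to/WPS/met_output/met_em.d01.* .

real.exe then picks the met files up from its working directory.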

!!! IMPORTANT !!!

The following line should be removed from the namelist.wps file - SAN 11/01/12

 opt_ignore_dom_center = .true., 

from http://www.mmm.ucar.edu/wrf/users/docs/user_guide/users_guide_chap3.html#_Description_of_the_1 :


6. OPT_IGNORE_DOM_CENTER : A logical value, either .TRUE. or .FALSE., specifying whether, for times other than the initial time, interpolation of meteorological fields to points on the interior of the simulation domain should be avoided in order to decrease the runtime of metgrid. Default value is .FALSE..
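The line can be deleted by hand, or with a one-line command (a sketch, assuming GNU sed):

sed -i '/opt_ignore_dom_center/d' namelist.wps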


WPS stuff

The path to geography data which WPS needs to access has been changed from:  geog_data_path = '/work/n02/n02/sru20/temp/DATA/v3.2/geog'

To:  geog_data_path = '/work/n02/n02/sru20/DATA/v3.2/geog' 

Please take note of this and make the changes in your namelist.wps files!!!!

When running with V3.4.1, use geog data located here:  geog_data_path = '/nerc/n02/n02/lowe/WPS/WPS_geog_v3.4/geog/',