How to run WRF-SFIRE with real data

From the Open Wildland Fire Modeling E-community Wiki
Back to the WRF-SFIRE user guide.

Running WRF-SFIRE with real data is a process very similar to running WRF with real data for weather simulations. The WRF users page has many documents and tutorials outlining this process. The purpose of this page is to provide a tutorial for using real data with WRF-SFIRE starting from scratch. We begin with a quick outline of the steps involved, including links to the output of each step. You can use these linked files to start from any step or to verify your own results. Due to platform and compiler differences, your output might differ slightly from the files provided.

This page refers to data sources for the USA only. For other countries, you will need to make appropriate modifications yourself.

Outline

  1. Compile WRF-SFIRE source code with target em_real.
  2. Compile WPS.
  3. Configure your domain.
  4. Download geogrid datasets.
  5. Convert the fire data.
  6. Run the geogrid executable.
  7. Download atmospheric data.
  8. Run the ungrib executable.
  9. Run the metgrid executable.
  10. Run real.exe and wrf.exe.

Compiling WPS

After you have compiled WRF-SFIRE, run git clone https://github.com/openwfm/WPS at the same directory level as WRF-SFIRE, change into WPS, and run ./configure. This will present you with a list of configuration options similar to those given by WRF. You will need to choose one with the same compiler that you used to compile WRF-SFIRE. Generally, it is unnecessary to compile WPS with parallel support, and GRIB2 support is only necessary if your atmospheric data source requires it. Once you have chosen a configuration, you can compile with

./compile >& compile.log

Make sure to check for errors in the log file generated.

Configuring the domain

The physical domain is configured in the geogrid section of namelist.wps in the WPS directory. In this section, you should define the geographic projection with map_proj, truelat1, truelat2, and stand_lon. Available projections include 'lambert', 'polar', 'mercator', and 'lat-lon'. The center of the coarse domain is located at ref_lon longitude and ref_lat latitude. The computational grid is defined by e_we/e_sn, the number of (staggered) grid points in the west-east/south-north direction, and the grid resolution is defined by dx and dy in meters. We also specify the path to the static dataset that geogrid will read from, and the highest resolution (.3 arc seconds) at which this data is released.

&geogrid
 e_we              =  97,
 e_sn              =  97,
 geog_data_res     = '.3s',
 dx = 100,
 dy = 100,
 map_proj  = 'lambert',
 ref_lat = 39.728996
 ref_lon = -112.48999
 truelat1 = 39.5
 truelat2 = 39.9
 stand_lon = -112.8
 geog_data_path = '../WPS_GEOG'
/

The share section of the WPS namelist defines the fire subgrid refinement via subgrid_ratio_x and subgrid_ratio_y. With a ratio of 20, the fire grid is refined 20 times relative to the atmospheric grid, giving a resolution of 5 meters by 5 meters. The start_date and end_date parameters specify the time window over which the simulation will be run; atmospheric data must be available at both temporal boundaries. The interval_seconds parameter tells WPS the number of seconds between consecutive atmospheric datasets. For our example, we will be using the CFSR dataset, which is released every six hours, or 21,600 seconds.

&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2018-09-08_00:00:00',
 end_date   = '2018-09-08_06:00:00',
 interval_seconds = 21600,
 io_form_geogrid = 2,
 subgrid_ratio_x = 20,
 subgrid_ratio_y = 20,
/
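
The arithmetic behind these settings can be verified with a short script (a sketch for illustration; the variable names are ours, not WPS's):

```python
# Fire mesh resolution implied by the namelist values above.
dx = dy = 100.0            # atmospheric grid spacing in meters
subgrid_ratio = 20         # subgrid_ratio_x = subgrid_ratio_y

fire_dx = dx / subgrid_ratio
fire_dy = dy / subgrid_ratio
print(fire_dx, fire_dy)    # 5.0 5.0 -- the fire grid is 5 m x 5 m

interval_seconds = 6 * 60 * 60   # six hours between CFSR analyses
print(interval_seconds)          # 21600
```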

The full namelist used can be found at pastebin or namelist.wps.

Obtaining data for geogrid

First, you must download and uncompress the standard geogrid input data as explained here. This is a 2.6 GB compressed tarball that uncompresses to around 29 GB. It contains all of the static data that geogrid needs for a standard weather simulation; however, for a WRF-SFIRE simulation we need to fill in two additional fields that are too big to release in a single download for the whole globe. We first need to determine the approximate latitude and longitude bounds for our domain.

We know the coordinates of the center from the ref_lon and ref_lat parameters of the namelist. We can estimate the coordinates of the lower-left and upper-right corners using the approximate ratio of 9e-6 degrees per meter (one degree of latitude spans about 111 km). So, the lower-left and upper-right corners of our domain are at approximately

ref_lon ± (97-1)/2*100*9e-6
ref_lat ± (97-1)/2*100*9e-6

Therefore, for the purposes of downloading data, we will expand this region to the range -112.55 through -112.4 longitude and 39.65 through 39.8 latitude.
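
The corner estimate above can be computed directly from the namelist values (a sketch of the arithmetic; variable names are ours):

```python
# Estimate the lat/lon bounding box of the domain from namelist.wps,
# using the rough conversion of 9e-6 degrees per meter.
DEG_PER_METER = 9e-6

ref_lat, ref_lon = 39.728996, -112.48999
e_we = e_sn = 97           # staggered grid points
dx = dy = 100.0            # grid spacing in meters

half_lon = (e_we - 1) / 2 * dx * DEG_PER_METER
half_lat = (e_sn - 1) / 2 * dy * DEG_PER_METER

print(f"lon: {ref_lon - half_lon:.4f} to {ref_lon + half_lon:.4f}")
# lon: -112.5332 to -112.4468
print(f"lat: {ref_lat - half_lat:.4f} to {ref_lat + half_lat:.4f}")
# lat: 39.6858 to 39.7722
```

Rounding these bounds outward gives the download range of -112.55 to -112.4 longitude and 39.65 to 39.8 latitude used below.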

Downloading fuel category data

For the United States, Anderson 13 fuel category data is available at the Landfire website. Upon opening the national map, click on the Download Tool [1] and you will see a menu on the right of the screen. Click on the LF 2016 Remap (LF_200), then Fuel, and us_200 13 Fire Behavior Fuel Models-Anderson [2]. Finally, click on the Define Download Area By Coordinates button [3].

Landfire new1.png


This will open a new window on the right with a form that lets you key in the longitude and latitude range of your selection. In this window, we will input the coordinates computed earlier [4], and below we will click the Download Area button [5].

Landfire new2.png


In the next window, click on the Modify button [6]. This will open a new window listing all of the available data products for the selected region. Make sure only the box next to US_200 13 Fire Behavior Fuel Models-Anderson is checked, and change the data format from ArcGRID_with_attribs to GeoTIFF_with_attribs. At the bottom, make sure Maximum size (MB) per piece: is set to 250. Then go to the bottom of the page and click Save Changes & Return to Summary.

Landfire new3.png


Finally, click on the Download button [7]. The file will be a compressed archive containing, among others, a GeoTIFF file. The name of the file will be different for each request, but in this example we have lf45409014_US_200FBFM13.zip containing the GeoTIFF file US_200FBFM13.tif, which can be found at File:US 200FBFM13.tif or US_200FBFM13.tif.

Landfire new4.png


Downloading high resolution elevation data

For the United States, elevation data is also available at the Landfire website. Repeat the steps described above for downloading the fuel data, but select instead Topographic and us_Elevation.

Landfire new5.png

Again, we key in the coordinates determined before and click the Download Area button.
In the next window, click Modify again, make sure only us_Elevation is selected, change the format to GeoTIFF, and click Save Changes & Return to Summary.

Landfire new6.png


In the next window, you should be able to click Download in order to download the GeoTIFF file containing the topography. You will obtain the zip file lf34682161_US_DEM2016.zip containing a GeoTIFF file, which can be downloaded from File:US DEM2016.tif or US_DEM2016.tif.

Converting fire data

This section describes converting the data from GeoTIFF to geogrid format.

In order for geogrid to be able to read this data, we need to convert it into an intermediate format. We will use a utility program included in the wrfxpy repository. For information on how to obtain and use this tool, see How to convert data for Geogrid. Go to the wrfxpy installation obtained earlier and move the GeoTIFF files into that directory.

To convert the fuel and elevation data, we will run

./convert_geotiff.sh US_200FBFM13.tif geo_data NFUEL_CAT
./convert_geotiff.sh US_DEM2016.tif geo_data ZSF

The resulting geo_data/NFUEL_CAT/index file is created as follows.

projection = albers_nad83
dx = 30.0
dy = -30.0
truelat1 = 29.5
truelat2 = 45.5
stdlon = -96.0
known_x = 258.0
known_y = 313.0
known_lon = -112.47513542444187
known_lat = 39.725087912688274
row_order = top_bottom
description = "Anderson 13 fire behavior categories"
units = "fuel category"
type = categorical
signed = yes
category_min = 0
category_max = 14
scale_factor = 1.0
wordsize = 2
tile_x = 515
tile_y = 625
tile_z = 1
endian = little
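
One quick sanity check on the index file: the converter anchors the projection at the known point known_x, known_y, which here is the center pixel of the tile (assuming 1-based pixel indexing, an assumption about the converter's convention):

```python
# Check that the known point in the index file is the tile center
# (assumes 1-based pixel indexing).
tile_x, tile_y = 515, 625          # raster size from the index file

center_x = (tile_x + 1) / 2
center_y = (tile_y + 1) / 2
print(center_x, center_y)          # 258.0 313.0 -- matches known_x, known_y
```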

The word size of 2 bytes is more than enough to represent the fuel categories; a single byte would already suffice for 256 categories. Notice that the program has changed the number of categories to 14, using the last category to indicate that the source data was out of the range 1-13. For the fuel category data, this indicates that no fuel is present, due to a lake, river, road, etc.

We can check that the projection information entered into the index file is correct by running the gdalinfo binary that is installed with GDAL. In this case, gdalinfo reports that the source file contains the following projection parameters.

Driver: GTiff/GeoTIFF
Files: US_200FBFM13.tif
Size is 515, 625
Coordinate System is:
PROJCS["USA_Contiguous_Albers_Equal_Area_Conic_USGS_version",
    GEOGCS["NAD83",
        DATUM["North_American_Datum_1983",
            SPHEROID["GRS 1980",6378137,298.2572221010042,
                AUTHORITY["EPSG","7019"]],
            AUTHORITY["EPSG","6269"]],
        PRIMEM["Greenwich",0],
        UNIT["degree",0.0174532925199433],
        AUTHORITY["EPSG","4269"]],
    PROJECTION["Albers_Conic_Equal_Area"],
    PARAMETER["standard_parallel_1",29.5],
    PARAMETER["standard_parallel_2",45.5],
    PARAMETER["latitude_of_center",23],
    PARAMETER["longitude_of_center",-96],
    PARAMETER["false_easting",0],
    PARAMETER["false_northing",0],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]]]
Origin = (-1400235.000000000000000,1986555.000000000000000)
Pixel Size = (30.000000000000000,-30.000000000000000)
Metadata:
  AREA_OR_POINT=Area
  DataType=Thematic
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (-1400235.000, 1986555.000) (112d35' 1.88"W, 39d47'44.01"N)
Lower Left  (-1400235.000, 1967805.000) (112d32'44.10"W, 39d37'50.78"N)
Upper Right (-1384785.000, 1986555.000) (112d24'16.21"W, 39d49' 9.72"N)
Lower Right (-1384785.000, 1967805.000) (112d21'59.86"W, 39d39'16.30"N)
Center      (-1392510.000, 1977180.000) (112d28'30.49"W, 39d43'30.32"N)
Band 1 Block=128x128 Type=Int16, ColorInterp=Gray
  NoData Value=-9999
  Metadata:
    RepresentationType=THEMATIC  

The resulting geo_data/ZSF/index file is created as follows.

projection = albers_nad83
dx = 30.0
dy = -30.0
truelat1 = 29.5
truelat2 = 45.5
stdlon = -96.0
known_x = 258.0
known_y = 313.0
known_lon = -112.47513542444187
known_lat = 39.725087912688274
row_order = top_bottom
description = "National Elevation Dataset 1/3 arcsecond resolution"
units = "meters"
type = continuous
signed = yes
scale_factor = 1.0
wordsize = 2
tile_x = 515
tile_y = 625
tile_z = 1
endian = little

Here we have used a word size of 2 bytes and a scale factor of 1.0, which can represent any elevation in the world with 1-meter accuracy, approximately the accuracy of the source data.
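
The claim is easy to verify: a signed 2-byte word with a scale factor of 1.0 covers every terrestrial elevation with room to spare (a quick check of the arithmetic, not wrfxpy code):

```python
# Elevation range representable by a signed 2-byte word with
# scale_factor = 1.0 (i.e. 1-meter steps).
wordsize = 2                       # bytes per value, from the index file
scale_factor = 1.0

max_elev = (2 ** (8 * wordsize - 1) - 1) * scale_factor
min_elev = -(2 ** (8 * wordsize - 1)) * scale_factor
print(min_elev, max_elev)          # -32768.0 32767.0

# Comfortably covers terrestrial extremes
# (Everest ~8849 m, Dead Sea shore ~-430 m).
assert min_elev < -430 < 8849 < max_elev
```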

Again, we compare the projection parameters in the index file with those reported by gdalinfo and find that the conversion was correct.

Driver: GTiff/GeoTIFF
Files: US_DEM2016.tif
Size is 515, 625
Coordinate System is:
PROJCS["USA_Contiguous_Albers_Equal_Area_Conic_USGS_version",
    GEOGCS["NAD83",
        DATUM["North_American_Datum_1983",
            SPHEROID["GRS 1980",6378137,298.2572221010042,
                AUTHORITY["EPSG","7019"]],
            AUTHORITY["EPSG","6269"]],
        PRIMEM["Greenwich",0],
        UNIT["degree",0.0174532925199433],
        AUTHORITY["EPSG","4269"]],
    PROJECTION["Albers_Conic_Equal_Area"],
    PARAMETER["standard_parallel_1",29.5],
    PARAMETER["standard_parallel_2",45.5],
    PARAMETER["latitude_of_center",23],
    PARAMETER["longitude_of_center",-96],
    PARAMETER["false_easting",0],
    PARAMETER["false_northing",0],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]]]
Origin = (-1400235.000000000000000,1986555.000000000000000)
Pixel Size = (30.000000000000000,-30.000000000000000)
Metadata:
  AREA_OR_POINT=Area
  DataType=Thematic
Image Structure Metadata:
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  (-1400235.000, 1986555.000) (112d35' 1.88"W, 39d47'44.01"N)
Lower Left  (-1400235.000, 1967805.000) (112d32'44.10"W, 39d37'50.78"N)
Upper Right (-1384785.000, 1986555.000) (112d24'16.21"W, 39d49' 9.72"N)
Lower Right (-1384785.000, 1967805.000) (112d21'59.86"W, 39d39'16.30"N)
Center      (-1392510.000, 1977180.000) (112d28'30.49"W, 39d43'30.32"N)
Band 1 Block=128x128 Type=Int16, ColorInterp=Gray
  NoData Value=-9999
  Metadata:
    RepresentationType=THEMATIC

Finally, the converted data can be found here geo_data.tar.gz.

Running geogrid

The geogrid binary will create a NetCDF file called geo_em.d01.nc. This file will contain all of the static data necessary to run your simulation. Before we can run the binary, however, we must tell geogrid what data needs to be in these files, where it can find them, and what kind of preprocessing we want to be done. This information is contained in a run-time configuration file called GEOGRID.TBL, which is located in the geogrid subdirectory. The file that is released with WPS contains reasonable defaults for the variables defined on the atmospheric grid, but we need to add two additional sections for the two fire grid data sets that we have just created. We will append the geo_data/GEOGRID.TBL sections to the file geogrid/GEOGRID.TBL.

===============================
name = NFUEL_CAT
        dest_type = categorical
        interp_option = default:nearest_neighbor+average_16pt+search
        abs_path = /absolute/path/to/geo_data/NFUEL_CAT
        priority = 1
        fill_missing = 14
        subgrid = yes
        dominant_only = NFUEL_CAT
        z_dim_name = fuel_cat
        halt_on_missing = no
===============================
name = ZSF
        dest_type = continuous
        interp_option = default:average_gcell(4.0)+four_pt+average_4pt
        abs_path = /absolute/path/to/geo_data/ZSF
        priority = 1
        fill_missing = 0
        smooth_option = smth-desmth_special; smooth_passes=1
        subgrid = yes
        df_dx = DZDXF
        df_dy = DZDYF
        halt_on_missing = no
===============================

For NFUEL_CAT, we will use simple nearest-neighbor interpolation, while for ZSF, we will use bilinear interpolation with smoothing. Other configurations are possible; see the WPS users guide for further information. The full table used can be found at pastebin or GEOGRID.TBL.

Once we make these changes to the GEOGRID.TBL file, and ensure that all of the directories are in the correct place (including the default geogrid dataset at ../../WPS_GEOG), we can execute the geogrid binary.

./geogrid.exe

This will create a file called geo_em.d01.nc in the current directory, which can be found here, geogrid_output.tar.gz. The contents of this file can be viewed using your favorite NetCDF viewer.

Here, we have visualized the fire grid variables NFUEL_CAT and ZSF, as well as the variable HGT_M, which is the elevation data used by the atmospheric model. We can compare ZSF and HGT_M to verify that our data conversion process worked. The colormaps of the two pictures have been aligned so that we can make a quick visual check. As we can see, the two images have a similar structure and magnitude, but they do show some misalignment. Given that the data came from two different sources, in two different projections, the error is relatively minor. Because WPS converts between projections in single precision by default, floating-point error is a likely contributor. We may, in the future, consider making changes so that this conversion is done in double precision.

Obtaining atmospheric data

There are a number of datasets available to initialize a WRF real run; the WRF users page lists a few. One challenge in running a fire simulation is finding a dataset of sufficient resolution. One relatively high-resolution data source is the Climate Forecast System (CFS). Even so, this is only about 56 km resolution, so no small-scale weather patterns will appear in our simulation. In general, we would want to run a series of nested domains in order to capture small-scale weather features; however, we will proceed with a single-domain example.

The CFSR datasets are available at the following website, https://www.ncei.noaa.gov/data/climate-forecast-system/access/operational-analysis. We browse to the pressure and surface directories containing the data for September 08, 2018. Our simulation runs from hours 00-06 of that day, so we download the pressure GRIB files for hours 00 and 06, and the surface GRIB files for hours 00 and 06.

You can get these files also from here, CFSR_20180908_00-06.tar.gz.

Running ungrib

With the GRIB files downloaded, we need to process them separately for pressure and surface variables. First, we link the pressure GRIB files into the WPS directory using the script link_grib.csh, which takes as arguments all of the GRIB files needed for the simulation. In this case, we can run the following command in the WPS directory.

./link_grib.csh <path to>/CFSR_20180908_00-06/pressure/*.grib2

Substitute <path to> with the directory in which you saved the GRIB files. This command creates a series of symbolic links, with a predetermined naming sequence, to all of the GRIB files you pass as arguments. You should now have two new soft links named GRIBFILE.AAA and GRIBFILE.AAB.
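
For the curious, the naming sequence can be mimicked in a few lines (a sketch inferred from the AAA, AAB links seen here; the actual link_grib.csh is a csh script, so this is an illustration of the pattern, not the script itself):

```python
from string import ascii_uppercase

def gribfile_suffixes(n):
    """Three-letter suffixes AAA, AAB, ... one per linked GRIB file."""
    suffixes = []
    for i in range(n):
        a, rest = divmod(i, 26 * 26)
        b, c = divmod(rest, 26)
        suffixes.append(ascii_uppercase[a] + ascii_uppercase[b] + ascii_uppercase[c])
    return suffixes

print(gribfile_suffixes(2))   # ['AAA', 'AAB'] -- matches the two links above
```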

With the proper links in place, we need to tell ungrib what they contain. This is done by copying a variable table into the main WPS directory. Several variable tables are distributed with WPS which describe common datasets. You can find these in the directory WPS/ungrib/Variable_Tables. In particular, the file which corresponds to the CFSR grib files is called Vtable.CFSR, so we issue the following command to copy it into the current directory.

cp ungrib/Variable_Tables/Vtable.CFSR Vtable

We are now ready to run the ungrib executable.

./ungrib.exe

This will create two files in the current directory, named COLMET:2018-09-08_00 and COLMET:2018-09-08_06. We need to rename them before processing the surface variables:

mv COLMET:2018-09-08_00 COLMET_P:2018-09-08_00
mv COLMET:2018-09-08_06 COLMET_P:2018-09-08_06

and remove the GRIBFILE.* links by running

rm GRIBFILE.*

Now we can start over to process the surface variables:

./link_grib.csh <path to>/CFSR_20180908_00-06/surface/*.grib2

Substitute <path to> with the directory in which you have saved the grib files. You should now have two new soft links named GRIBFILE.AAA and GRIBFILE.AAB. We are now ready to run the ungrib executable again.

./ungrib.exe

This will again create two files in the current directory, named COLMET:2018-09-08_00 and COLMET:2018-09-08_06, which we rename:

mv COLMET:2018-09-08_00 COLMET_S:2018-09-08_00
mv COLMET:2018-09-08_06 COLMET_S:2018-09-08_06

The four files COLMET_P:2018-09-08_00, COLMET_P:2018-09-08_06, COLMET_S:2018-09-08_00, and COLMET_S:2018-09-08_06 are the resulting files which can be downloaded here, ungrib_output.tar.gz.

Running metgrid

Metgrid will take the files created by ungrib and geogrid and combine them into a set of input files for real.exe. At this point, all we need to do is run it.

./metgrid.exe

This creates two files named met_em.d01.2018-09-08_00:00:00.nc and met_em.d01.2018-09-08_06:00:00.nc, which you can download here, metgrid_output.tar.gz.

Running WRF-SFIRE

We are now finished with all steps involving WPS. All we need to do is copy the metgrid output files over to our WRF real run directory at WRF-SFIRE/test/em_real and configure our WRF namelist. We need to make sure that the domain description in namelist.input matches that of the namelist.wps we created previously; otherwise WRF will refuse to run. Pay particular attention to the start/stop times and the grid sizes. The fire ignition parameters are configured in the same way as for the ideal case. The relevant portions of the namelist we will use are given below.

&time_control
  run_days = 0
  run_hours = 6
  run_minutes = 0
  run_seconds = 0
  start_year = 2018
  start_month = 9
  start_day = 8
  start_hour = 0
  start_minute = 0
  start_second = 0
  end_year = 2018
  end_month = 9
  end_day = 8
  end_hour = 6
  end_minute = 0
  end_second = 0
  interval_seconds = 21600
  input_from_file = .true.
  history_interval = 30
  frames_per_outfile = 1000
  restart = .false.
  restart_interval = 180
  io_form_history = 2
  io_form_restart = 2
  io_form_input = 2
  io_form_boundary = 2
  debug_level = 1
/

&domains
  time_step = 0
  time_step_fract_num = 1
  time_step_fract_den = 2
  max_dom = 1
  s_we = 1
  e_we = 97
  s_sn = 1
  e_sn = 97
  s_vert = 1
  e_vert = 41
  num_metgrid_levels = 38
  num_metgrid_soil_levels = 4
  dx = 100
  dy = 100
  grid_id = 1
  parent_id = 1
  i_parent_start = 1
  j_parent_start = 1
  parent_grid_ratio = 1
  parent_time_step_ratio = 1
  feedback = 1
  smooth_option = 0
  sr_x = 20
  sr_y = 20
  sfcp_to_sfcp = .true.
  p_top_requested = 10000
/
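
The time-step settings above deserve a note: the effective model step is time_step plus the fraction time_step_fract_num/time_step_fract_den. A quick check of the implied step and step count (our arithmetic, not WRF output):

```python
# Effective WRF time step implied by the &domains settings above.
time_step = 0
time_step_fract_num, time_step_fract_den = 1, 2

dt = time_step + time_step_fract_num / time_step_fract_den
print(dt)                       # 0.5 seconds per step

run_hours = 6
steps = int(run_hours * 3600 / dt)
print(steps)                    # 43200 steps for the 6-hour run
```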
 
&bdy_control
  spec_bdy_width = 5
  spec_zone = 1
  relax_zone = 4
  specified = .true.
  periodic_x = .false.
  symmetric_xs = .false.
  symmetric_xe = .false.
  open_xs = .false.
  open_xe = .false.
  periodic_y = .false.
  symmetric_ys = .false.
  symmetric_ye = .false.
  open_ys = .false.
  open_ye = .false.
  nested = .false.
/

It is worth mentioning the different ifire options implemented:

  • ifire = 1: the current, up-to-date WRF-SFIRE code
  • ifire = 2: the fire code from 2012 in WRF, with changes made at NCAR

Visit README-SFIRE.md for more details.

The full namelist used can be found at pastebin or namelist.input.

Once the namelist is properly configured, we run the WRF real preprocessor.

./real.exe

This creates the initial and boundary files for the WRF simulation and fills all missing fields from the grib data with reasonable defaults. The files that it produces are wrfbdy_d01 and wrfinput_d01, which can be downloaded here, wrf_real_output.tar.gz.

To prepare for running the fire model, copy its parameters here:

cp ../em_fire/hill/namelist.fire .
cp ../em_fire/hill/namelist.fire_emissions .

Finally, we run the simulation.

./wrf.exe

The history file for this example can be downloaded here, wrf_real_history.tar.gz.