Hylton Haynes

Making sense of the magnitude of the US wildfire problem

Blog Post created by Hylton Haynes on Feb 28, 2014

In follow-up to my colleague Michele Steinberg's recent blog, "Crazy numbers? Wildfire reporting isn't as easy as it might seem," a remarkable piece of research addressing this matter has recently been published.

Dr. Karen Short, a research ecologist with the US Forest Service Rocky Mountain Research Station, Fire Sciences Lab in Missoula, Montana, recently published a very informative research paper titled "A spatial database of wildfires in the United States, 1992–2011."  This paper is an incredibly valuable piece of research in that it describes in detail the level of complexity (1.6 million records from multiple fire occurrence reporting systems) that the researcher had to contend with in developing this Fire Program Analysis Fire Occurrence Dataset for the period 1992–2011.  A second edition that includes the 2012 dataset is scheduled for publication sometime this year.  The core data elements identified for this project were location, discovery date, and final fire size in acres.  According to Dr. Short, 82% of the National Fire Incident Reporting System (NFIRS) records (for the year 2010) lacked the location data needed for inclusion in this spatial database of wildfires.  NFPA and other federal partners are actively engaged in trying to overcome this fundamental limitation of the NFIRS data.
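To make the screening criterion concrete, here is a minimal sketch in Python of how records might be checked for the three core data elements the paper identifies (location, discovery date, and final fire size). The field names and sample records are hypothetical illustrations, not the actual schema of the Fire Program Analysis dataset.

```python
from datetime import date

# Hypothetical, simplified fire-occurrence records; the field names
# are illustrative only, not the actual FPA FOD schema.
records = [
    {"id": 1, "latitude": 46.87, "longitude": -113.99,
     "discovery_date": date(2010, 7, 4), "final_size_acres": 12.5},
    {"id": 2, "latitude": None, "longitude": None,        # missing location
     "discovery_date": date(2010, 8, 1), "final_size_acres": 3.0},
    {"id": 3, "latitude": 44.05, "longitude": -103.23,
     "discovery_date": None, "final_size_acres": 0.1},    # missing date
]

def has_core_elements(rec):
    """A record qualifies only if location, discovery date,
    and final fire size are all present."""
    core_fields = ("latitude", "longitude", "discovery_date",
                   "final_size_acres")
    return all(rec.get(field) is not None for field in core_fields)

qualifying = [r for r in records if has_core_elements(r)]
print([r["id"] for r in qualifying])  # only record 1 has all three elements
```

A check along these lines, applied to 1.6 million records drawn from multiple reporting systems, gives a sense of why so many NFIRS records fell out of the spatial database: a single missing element, most often location, disqualifies the record.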

Image 1: Locations of wildfire records per year (excluding Alaska and Hawaii) (Short, 2014)

What stands out in these maps is how pervasive wildfire exposure becomes over the period from 1992 to 2011.  It will be interesting to see what influence the NFIRS data records could have on future editions of the dataset.

The data is available on the US Forest Service Research Data Archive web page, in Microsoft Access and Esri file geodatabase formats.
