Remote sensing is a powerful tool for measuring fire's effects on plant life. How severe was the damage to vegetation? How quickly can a landscape recover? A new project examines those questions from before-, during-, and after-fire perspectives, as the Wildland Fire Team at EROS investigates how different remote sensing technologies might be used together.

Remote sensing provides a spatially consistent and comprehensive way to monitor vegetation condition at the landscape scale. There is therefore considerable interest in leveraging different remote sensing assets to characterize vegetation status pre-fire and vegetation recovery post-fire. Different sensor types capture vegetation characteristics in different ways. For example, passive optical imagers such as Landsat provide spectral data that relate to characteristics such as vegetation greenness and canopy cover. In contrast, data acquired from lidar instruments can provide estimates of canopy structure such as height and biomass.

The Post-Fire Hazards Impacts to Resources and Ecosystems (PHIRE): Support for Response, Recovery, and Mitigation project is developing an integrated approach to create linkages between the pre-fire, during-fire, and post-fire environments in ways that have demonstrated importance to stakeholders. As part of this project, the USGS Earth Resources Observation and Science (EROS) Center’s Fire Science Team is seeking to better understand the advantages of various remote sensing technologies and how they can be used collectively to gain a more complete picture of vegetation condition, with a focus on moderate spatial resolution (~20-50 m) data suitable for landscape-to-regional scale assessments.

Figure 1: Example of canopy height products developed from GEDI data. A. MTBS thematic burn severity for the Dixie Fire, B. Pre-fire GEDI 95th percentile relative height product, and C. Post-fire GEDI 95th percentile relative height. A significant decrease in height values can be noted in areas mapped as high or moderate severity by MTBS.

To capture vegetation structure consistently for a given study area, the team is exploring the use of the Global Ecosystem Dynamics Investigation (GEDI) spaceborne lidar instrument to derive wildfire-relevant structure metrics. The dense footprint coverage of GEDI data within the study areas offers an opportunity to examine changes in pre- and post-fire vegetation structural data (including biomass) as well as fuels characteristics (e.g., canopy base height and canopy bulk density). An automated methodology has been developed to process GEDI data collected in and around the fire boundary, enabling more efficient and consistent derivation of the metrics of interest at the lidar footprint level and extrapolation of those metrics to spatially comprehensive maps (Fig. 1). Geospatial data sources (e.g., weather, topography, and image reflectance) are extracted at the discrete footprint locations and used as independent datasets in a machine learning (ML) model (e.g., random forest, XGBoost). Models can then be applied through time using the static (e.g., topographic) and variable (e.g., satellite reflectance) data to determine pre-fire vegetation structure and fuel loads and to estimate the amount of post-fire biomass lost and subsequent vegetation recovery.
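The footprint-to-map workflow described above can be sketched in a few lines of Python. This is a minimal illustration using synthetic data: the predictor variables, the RH95 response, and the random forest configuration are all stand-ins, since the project's actual GEDI metrics and model settings are not specified in the text.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

# Synthetic training table: one row per GEDI footprint, with
# hypothetical predictors extracted at each footprint location.
n_footprints = 500
elevation = rng.uniform(500, 2500, n_footprints)   # m
slope = rng.uniform(0, 40, n_footprints)           # degrees
nir = rng.uniform(0.1, 0.5, n_footprints)          # NIR reflectance
swir = rng.uniform(0.05, 0.3, n_footprints)        # SWIR reflectance
X = np.column_stack([elevation, slope, nir, swir])

# Synthetic response: RH95 (95th percentile relative height, m),
# loosely tied to reflectance so the model has a signal to learn.
rh95 = 25 * nir - 20 * swir + 5 + rng.normal(0, 1, n_footprints)

# Fit a footprint-level model (random forest, one of the ML options
# named in the text).
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, rh95)

# Apply the model to a wall-to-wall predictor grid (a toy 50x50
# pixel raster flattened to rows) to extrapolate the footprint
# measurements into a spatially comprehensive height map.
n_pixels = 50 * 50
grid = np.column_stack([
    rng.uniform(500, 2500, n_pixels),
    rng.uniform(0, 40, n_pixels),
    rng.uniform(0.1, 0.5, n_pixels),
    rng.uniform(0.05, 0.3, n_pixels),
])
height_map = model.predict(grid).reshape(50, 50)
```

Applying the same fitted model to predictor grids from different dates (static topography plus date-specific reflectance) is what allows the height and fuel estimates to be carried through time.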

Another way to characterize vegetation dynamics is to examine how a classified vegetation product changes through time. The team has developed a process to sample every undisturbed LANDFIRE Existing Vegetation Type (EVT) product 30 m pixel within 3 km of a fire boundary. Pixel locations are split into separate “test” and “train” data pools, and independent geospatial data (e.g., Landsat bands 2-7 from 30 m Analysis Ready Data (ARD)) from one year before the fire (during peak greenness) are extracted. The independent variables are then used to build an ML EVT model from the train data, with accuracy metrics computed from the withheld test data. This ML model can then be applied backwards and forwards in time to ARD data acquired during the same time of year as that for which the original model was developed. The resulting EVT output maps can then be examined for classification changes caused by fire and, over time, for classification changes occurring during recovery.
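A compact sketch of that train/test workflow is below, using synthetic pixels and a scikit-learn random forest classifier as a stand-in for the project's ML model. The band values, EVT labels, and model choice are illustrative assumptions, not the project's actual configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic samples: one row per 30 m pixel within 3 km of the fire
# boundary, with Landsat ARD bands 2-7 reflectance as predictors.
n_pixels = 1000
bands = rng.uniform(0.0, 0.6, size=(n_pixels, 6))

# Hypothetical EVT class labels (0, 1, or 2), deterministically tied
# to two bands so the classifier has structure to learn.
evt = (bands[:, 3] > 0.3).astype(int) + (bands[:, 5] > 0.4).astype(int)

# Split pixel locations into "train" and withheld "test" pools.
X_train, X_test, y_train, y_test = train_test_split(
    bands, evt, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Accuracy metrics come from the withheld test pool; the fitted
# model can then be applied to ARD composites from other years
# (same season) to map EVT backwards and forwards in time.
accuracy = accuracy_score(y_test, clf.predict(X_test))
```

Because the model is trained only on same-season, pre-fire imagery, applying it to other years with matching phenology is what makes the year-to-year EVT comparisons meaningful.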

Figure 2: Example analysis of the Pigeon Fire in the MHRD Complex in 1999, using a Normalized Burn Ratio (NBR) time series and differenced NBR imagery (1998 to 2000). A. NBR trends by Monitoring Trends in Burn Severity (MTBS) burn severity category demonstrate a linkage between higher severity (NBR difference) and longer recovery times. B. Error analyses were completed by comparing the regression model (green line) with the observed averaged recovery times (blue points) and averaged absolute error (orange line). C. The regression model applied to the differenced NBR image shows the spatial distribution of the forecasted recovery time at the fire scale.

Long-term landscape recovery is also being assessed spectrally using a time series of composited Landsat imagery (1984-2022). While NBR-based burn severity assessments are widely used, as in the Monitoring Trends in Burn Severity program, this effort seeks to apply a similar methodology over a longer timeframe. In terms of NBR, areas of higher burn severity are well known to take significantly longer to recover to pre-fire conditions than areas of lower burn severity. This effort seeks to establish a method to forecast recovery time at the landscape scale, using a differenced-NBR assessment as the starting point. Accounting for meteorological/climatological as well as topographic influences on this recovery will allow a range of recovery scenarios to be produced. Currently, the working recovery model is being used to generate prototype forecasts of recovery time for historic fires near the Dixie Fire to build a long-term post-fire recovery record (Fig. 2). A more robust working definition of spectral recovery is being developed to include climatological variables such as seasonal precipitation and temperature anomalies, as well as topographic inputs (slope, aspect, and elevation). More robust modeling methods such as ML are being considered to determine the impacts of these additional inputs and whether they should be incorporated into additional forecasts.
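The differenced-NBR starting point can be illustrated with a few lines of NumPy. NBR is computed from the NIR and SWIR2 bands, and dNBR is pre-fire NBR minus post-fire NBR. The linear recovery-time relationship at the end is purely a placeholder: the project fits its own model and coefficients from the 1984-2022 record, which are not given in the text.

```python
import numpy as np

def nbr(nir, swir2):
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2)."""
    return (nir - swir2) / (nir + swir2)

# Toy pre- and post-fire surface reflectance for a 2x2 pixel patch
# (for Landsat 8/9, NIR is band 5 and SWIR2 is band 7).
nir_pre = np.array([[0.40, 0.42], [0.38, 0.41]])
swir2_pre = np.array([[0.10, 0.12], [0.11, 0.10]])
nir_post = np.array([[0.20, 0.15], [0.35, 0.12]])
swir2_post = np.array([[0.25, 0.30], [0.14, 0.32]])

# Differenced NBR: larger positive values indicate more severe
# vegetation loss.
dnbr = nbr(nir_pre, swir2_pre) - nbr(nir_post, swir2_post)

# Hypothetical linear recovery-time forecast (years); the slope and
# intercept are invented placeholders, not the project's fit.
recovery_years = 2.0 + 12.0 * dnbr
```

Extending this per-pixel regression with climate and terrain covariates is exactly where the ML methods mentioned above would replace the simple linear form.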

Next steps will include integrating what we learn from using the different sensor types to characterize vegetation condition and post-fire recovery and developing a method that can leverage their individual strengths into products or tools that are useful for operational post-fire decision support. There is also a need to further explore how these moderate resolution products relate to and link with data collected at significantly finer resolutions, for example, understanding the fidelity loss when comparing vegetation structure assessments derived from terrestrial lidar, airborne lidar, or GEDI. 

Funding: Funding for this project is provided by the Robert T. Stafford Disaster Relief and Emergency Assistance Act (42 U.S.C. 5121 et seq.) and supplemental funding acts for Federal disaster relief activities. Through this funding, USGS supports recovery efforts in declared natural disaster areas, aiding recovery from widespread wildfires, devastating hurricanes, prolonged volcanic eruptions, and damaging earthquakes. It enables USGS to repair and replace equipment and facilities, collect high-resolution elevation data, and conduct scientific studies and assessments to support recovery and rebuilding decisions.
