We used a common garden study to quantify variation in growth and drought resistance traits in 99 populations of Elymus elymoides from a broad geographic and climatic range in the western United States. Ecotypes from drier sites produced less biomass and smaller seeds, and had traits associated with greater drought resistance: small leaves with low osmotic potential and high integrated water use efficiency (δ13C). Seasonality also influenced plant traits. Plants from regions with relatively warm, wet summers had large seeds, large leaves, and low δ13C. Irrespective of climate, we also observed trade-offs between biomass production and drought resistance traits. Together, these results suggest that much of the phenotypic variation among E. elymoides ecotypes represents local adaptation to differences in the amount and timing of water availability. In addition, ecotypes that grow rapidly may be less able to persist under dry conditions. Land managers may be able to use this variation to improve restoration success by seeding ecotypes with multiple drought resistance traits in areas with lower precipitation. The future success of this common rangeland species will likely depend on the use of tools such as seed transfer zones to match local variation in growth and drought resistance to predicted climatic conditions.
Urban populations rely on a suite of ecosystem services generally provided by the ecological function of natural areas. But the expansion of urban environments and the growth of suburban or exurban neighborhoods often necessitate destruction of those natural areas for development supporting a growing urban populace. Ecological impacts from development reduce regional biodiversity and negatively affect the ability of remaining natural areas to provide goods and services critical to people. Secondary impacts to biodiversity also occur at broad geographic scales through commodity production supporting urban centers. For example, agricultural production often involves creating agroeconomic systems based largely on farming a limited number of species, and commonly relegates biological diversity to small patches of land deemed unsuitable for crops. Such practices exacerbate the negative biological effects inherent in urban development and drastically increase the need for urban populations to address biological diversity within municipalities. Residents are becoming increasingly knowledgeable about environmental issues and are expressing values and concerns to local and regional management agencies. Governments are responding to public pressure with recommendations intended to reduce resource use, improve wildlife habitat, and provide a local aesthetic. Although the appropriateness of native plants in urban settings is often questioned, the use of regionally specific native vegetation is identified as one method to meet those recommendations. Native plants as primary landscape elements have the added benefit of increasing biodiversity and creating environments capable of providing ecosystem goods and services within urban environments.
The actions of residents in the wildland–urban interface can influence the private and social costs of wildfire. Wildfire programs that encourage residents to take action are often delivered without evidence of effects on behavior. Research from the field of behavioral science shows that simple, often low-cost changes to program design and delivery can influence socially desirable behaviors. In this research report, we highlight how behavioral science and experimental design may advance efforts to increase wildfire risk mitigation on private property. We offer an example in which we tested changes in outreach messaging on property owners' interest in wildfire risk information. In partnership with a regional wildfire organization, we mailed 4564 letters directing property owners to visit personalized wildfire risk webpages. By tracking visitation, we observed that 590 letter recipients (13%) sought information about their wildfire risk and that response varied by community. This research–practice collaboration has three benefits: innovation in outreach, evidence of innovation through experimental design, and real impacts on interest in wildfire mitigation among property owners. Future collaborations may inform behavioral and evidence-based programs to better serve residents and the public interest as the risks from wildfires are projected to grow.
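The response-rate comparison at the heart of such an experimental design can be illustrated with a short calculation. The sketch below is hypothetical: the 50/50 split of the mailing into two message variants and the per-variant counts are invented, and only the overall totals (4564 letters, 590 responses) come from the abstract.

```python
# Hypothetical sketch of an outreach messaging experiment: a two-sided
# z-test for a difference in response rates between two message variants.
# Per-variant counts are invented; only the totals (4564 letters,
# 590 responses) appear in the abstract.
from statistics import NormalDist

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p1, p2, z, p_value

# Invented 50/50 split of the mailing into message variants A and B.
pA, pB, z, p = two_proportion_ztest(330, 2282, 260, 2282)
print(f"variant A: {pA:.1%}  variant B: {pB:.1%}  z = {z:.2f}  p = {p:.4f}")
```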
National and regional preparedness level (PL) designations support decisions about wildfire risk management. Such decisions occur across the fire season and influence pre-positioning of resources in areas of greatest fire potential, recall of personnel from off-duty status, requests for back-up resources from other areas, responses to requests to share resources with other regions during fire events, and decisions about fuel treatment and risk reduction, such as prescribed burning. In this paper, we assess the association between PLs assigned at national and regional (Northwest) scales and a set of predictors including meteorological and climate variables, wildfire activity and the mobilisation and allocation levels of fire suppression resources. To better understand the implicit weighting applied to these factors in setting PLs, we discern the qualitative and quantitative factors associated with PL designations by statistical analysis of the historical record of PLs across a range of conditions. Our analysis constitutes an important step towards efforts to forecast PLs and to support the future projection and anticipation of firefighting resource demand, thereby aiding wildfire risk management, planning and preparedness.
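The abstract does not name the authors' statistical model, but an ordered-logit regression is one natural way to relate an ordinal PL scale to predictors of this kind. The sketch below uses synthetic data and assumed predictor names (erc, active_fires, crews_committed); it illustrates the general approach, not the paper's actual analysis.

```python
# Minimal sketch, assuming an ordered-logit model for an ordinal PL scale
# (1-5). Data are synthetic; predictor names are assumptions.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "erc": rng.normal(60, 15, n),             # energy release component
    "active_fires": rng.poisson(8, n),        # concurrent large fires
    "crews_committed": rng.uniform(0, 1, n),  # fraction of crews committed
})

# Latent propensity plus logistic noise, binned into five ordered levels
latent = 0.04 * df["erc"] + 0.15 * df["active_fires"] + 2.0 * df["crews_committed"]
levels = pd.cut(latent + rng.logistic(0, 1, n), bins=5, labels=False) + 1
df["pl"] = pd.Categorical(levels, ordered=True)

model = OrderedModel(df["pl"], df[["erc", "active_fires", "crews_committed"]],
                     distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```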
National monitoring of forestlands and of the processes causing canopy cover loss, be they abrupt or gradual, partial or stand-clearing, temporary (disturbance) or persisting (deforestation), is necessary at fine scales to inform management, science and policy. This study utilizes the Landsat archive and an ensemble of disturbance algorithms to produce maps attributing event type and timing to >258 million ha of contiguous United States forested ecosystems (1986-2010). Nationally, 75.95 million forest ha (759,531 km²) experienced change, with 80.6% attributed to removals, 12.4% to wildfire, 4.7% to stress and 2.2% to conversion. Between regions, the relative amounts and rates of removals, wildfire, stress and conversion varied substantially. The removal class had 82.3% (0.01 S.E.) user's and 72.2% (0.02 S.E.) producer's accuracy. A survey of available national attribution datasets from the data user's perspective, considering scale, relevant processes and ecological depth, suggests knowledge gaps remain.
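For readers unfamiliar with the accuracy measures quoted here: user's accuracy is the share of pixels mapped as a class that the reference data confirm (the complement of commission error), and producer's accuracy is the share of reference pixels of that class that the map captures (the complement of omission error). A worked example, with an invented confusion matrix chosen to reproduce the quoted removal-class values:

```python
# Worked example of user's and producer's accuracy. The confusion-matrix
# counts are invented to match the removal-class figures quoted above;
# rows are map labels and columns are reference labels.
import numpy as np

# classes: [removal, other]
cm = np.array([[823, 177],     # mapped removal: 823 agree, 177 commission
               [317, 8683]])   # mapped other:   317 omitted removals

users_acc = cm[0, 0] / cm[0, :].sum()      # correct / all mapped removal
producers_acc = cm[0, 0] / cm[:, 0].sum()  # correct / all reference removal
print(f"user's accuracy:     {users_acc:.1%}")    # 82.3%
print(f"producer's accuracy: {producers_acc:.1%}")  # 72.2%
```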
A climatic dipole drives short- and long-term patterns of postfire forest recovery in the western US
Here, we identify a north–south dipole in annual climatic moisture deficit anomalies across the Interior West of the US and characterize its influence on forest recovery from fire. We use annually resolved establishment models from dendrochronological records to correlate this climatic dipole with short-term postfire juvenile recruitment. We also examine longer-term recovery trajectories using Forest Inventory and Analysis data from 989 burned plots. We show that annual postfire ponderosa pine recruitment probabilities in the northern Rocky Mountains (NR) and the southwestern US (SW) track the strength of the dipole, while declining overall due to increasing aridity. This indicates that divergent recovery trajectories may be triggered concurrently across large spatial scales: favorable conditions in the SW can correspond to drought in the NR that inhibits ponderosa pine establishment, and vice versa. The imprint of this climatic dipole is manifest for years postfire, as evidenced by dampened long-term likelihoods of juvenile ponderosa pine presence in areas that experienced postfire drought. These findings underscore the importance of climatic variability at multiple spatiotemporal scales in driving cross-regional patterns of forest recovery and have implications for understanding ecosystem transformations and species range dynamics under global change.
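As a rough illustration of the dipole concept, consider an index defined as the difference between standardized north and south moisture-deficit anomalies. The sketch below uses entirely synthetic series; the study's actual index construction and establishment models may differ.

```python
# Minimal sketch, assuming a dipole index equal to the difference between
# standardized climatic moisture deficit (CMD) anomalies in the northern
# Rockies (NR) and the Southwest (SW). All series here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1986, 2011)
cmd_nr = rng.normal(0, 1, years.size)                    # NR anomaly (z-score)
cmd_sw = -0.6 * cmd_nr + rng.normal(0, 0.8, years.size)  # anticorrelated SW
dipole = cmd_nr - cmd_sw                                 # >0: NR dry, SW wet

# Hypothetical recruitment probabilities: drier years suppress recruitment
recruit_nr = 0.30 - 0.05 * cmd_nr + rng.normal(0, 0.02, years.size)
recruit_sw = 0.30 - 0.05 * cmd_sw + rng.normal(0, 0.02, years.size)

print("corr(dipole, NR recruitment):", round(np.corrcoef(dipole, recruit_nr)[0, 1], 2))
print("corr(dipole, SW recruitment):", round(np.corrcoef(dipole, recruit_sw)[0, 1], 2))
```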
Climate connectivity, the ability of a landscape to promote or hinder the movement of organisms in response to a changing climate, is contingent on multiple factors, including the distance organisms need to move to track suitable climate over time (i.e. climate velocity) and the resistance they experience along such routes. An additional consideration, which has received less attention, is that human land uses increase resistance to movement or alter movement routes, and thus influence climate connectivity. Here we evaluate the influence of human land uses on climate connectivity across North America by comparing two climate connectivity scenarios: one considering climate change in isolation, and the other considering climate change and human land uses together. In doing so, we introduce a novel metric of climate connectivity, 'human exposure', which quantifies the cumulative exposure to human activities that organisms may encounter as they shift their ranges in response to climate change. We also delineate potential movement routes and evaluate whether the protected area network supports movement corridors better than non-protected lands. We found that when incorporating human land uses, climate connectivity decreased: climate velocity increased on average by 0.3 km/year, and cumulative climatic resistance increased across ~83% of the continent. Moreover, ~96% of movement routes in North America must contend with human land uses to some degree. In the scenario that evaluated climate change in isolation, we found that protected areas do not support climate corridors at a higher rate than non-protected lands across North America. However, variability is evident, as many ecoregions contain protected areas that exhibit both more and less representation of climate corridors compared with non-protected lands. Overall, our study indicates that previous evaluations of climate connectivity underestimate climate change exposure because they do not account for human impacts.
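The 'human exposure' metric lends itself to a simple toy illustration: sum a human-modification score over the cells of a modeled movement route. Everything below (the grid, the route, the score) is fabricated for demonstration, and the authors' formal definition may differ.

```python
# Toy illustration of cumulative 'human exposure' along a movement route.
# The human-modification grid and the route are fabricated; the study's
# formal metric may be defined differently.
import numpy as np

rng = np.random.default_rng(2)
human_mod = rng.random((100, 100))   # 0 = natural, 1 = heavily modified

# A hypothetical climate-driven movement route as (row, col) grid cells
route = [(r, r // 2) for r in range(0, 100, 2)]

cumulative_exposure = sum(human_mod[r, c] for r, c in route)
mean_exposure = cumulative_exposure / len(route)
print(f"cumulative human exposure: {cumulative_exposure:.1f}")
print(f"mean per-cell exposure:    {mean_exposure:.2f}")
```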
US fire scientists are developing Potential Wildfire Operational Delineations, also known as ‘PODs’, as a pre-fire season planning tool to promote safe and effective wildland fire response, strengthen risk management approaches in fire management and better align fire management objectives. PODs are a collaborative planning approach based on spatial analytics to identify potential wildfire control lines and assess the desirability of fire before ignition. They offer the opportunity to apply risk management principles with partners before the compressed timeframe of incident response. We sought to understand the potential utility of PODs and factors that may affect their use through semi-structured interviews with personnel on several national forests. Interviewees said PODs offer a promising shift in the wildland fire management dynamic, particularly by facilitating proactive communication and coordination about wildfire response. Successfully employing PODs will require leadership commitment, stakeholder and partner engagement and interdisciplinary staff involvement. Our work offers insights for national forests and other jurisdictions where managers are looking to strengthen coordination and strategic approaches for wildland fire response by utilizing pre-season collaboration and data analytics.
This study used a value of information approach to demonstrate the cost-effectiveness of using satellite imagery as part of the Burned Area Emergency Response (BAER) program, a US federal program that identifies imminent post-wildfire threats to human life and safety, property and critical natural or cultural resources. It compared the costs associated with producing a Burned Area Reflectance Classification (BARC) map and implementing a BAER assessment when imagery from satellites (either Landsat or a commercial satellite) was available against those incurred when the response team relied on information collected solely by aerial reconnaissance. The case study included two evaluations with and without BARC products: (a) savings of up to US$51 000 for the Elk Complex wildfire incident request and (b) savings from a multi-incident map production program. Landsat is the most cost-effective way to supply burn severity information to the BAER program, with savings of up to US$35 million over a 5-year period.
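The value-of-information logic reduces to a cost difference: what the response would cost with aerial reconnaissance alone, minus what it costs when satellite-derived BARC maps are available. The component costs below are invented; only the US$51 000 savings figure for the Elk Complex comes from the abstract.

```python
# Back-of-envelope value-of-information comparison. Component costs are
# invented; only the US$51,000 savings figure appears in the abstract.
aerial_only_cost = 180_000     # assumed: reconnaissance flights + field survey
with_satellite_cost = 129_000  # assumed: BARC map production + reduced flights

value_of_information = aerial_only_cost - with_satellite_cost
print(f"value of satellite information: US${value_of_information:,}")  # US$51,000
```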