Box and burn: The future of U.S. wildfire policy.

Our Wildfire Policy May Look Like a Muddled Mashup, but It’s the Best We’ve Got

The citizen’s guide to the future.
July 4, 2014, 12:20 PM

Burn, Baby, Burn—if We Say So

U.S. wildfire policy is a muddled mashup, but it’s the best we’ve got.

A wildfire threatens homes in San Marcos, California, on May 15, 2014. The blazes come amid record temperatures in the state, where the annual wildfire season typically starts much later in the year.

Photo by Jorge Cruz/AFP/Getty Images

The last time California had a drought this severe, 1977, I was the foreman of the North Rim Longshots, a seasonal fire crew stationed at Grand Canyon National Park. By mid-June every national forest in Arizona had a major burn. Then California began one of its epic seasons, becoming a black hole that drafted in crews and aircraft from across the country in a running firefight that didn’t end until the winter solstice.

What was then exceptional has now become a new norm. The largest category of fire at the time was Class G (more than 5,000 acres). Now the big ones—megafires—come one or two orders of magnitude larger. For the last decade, we’ve had one or two a year on average.

The standard narrative for how this happened revolves around a misguided policy that, after the Great Fires of 1910, sought to remove fire of every kind with equal vigor. It didn’t matter if the fire started from lightning or from traditional human practices—and there were many such practices, because fire was widely used to assist foraging, hunting, farming, herding, prospecting, and land use generally, even though a fraction of that burning was careless and abusive. Nor did it matter if it was a good fire that enhanced the land or a bad one that degraded biota or burned through human settlements—they were all to be fought. The mere attempt to do so unhinged most ecosystems and primed many for more explosive burning. By the 1980s that legacy merged with a climate tipped toward drought and a rural landscape that had once buffered wildland from town but was now disappearing into wilds or exurbs.


A fuller narrative breaks that story in two. For 50 years official policy sought to stop people from using fire in traditional ways, for good or ill—and to suppress any fires that did occur from any source. This project in “fire exclusion” was overseen by the U.S. Forest Service, which became a benign hegemon. It was a strategy that stopped bad fires that might burn into commercial timber or communities. But because it also eliminated the effects of good fire, it ended up triggering the equivalent of an ecological earthquake.

The second half of the story begins in 1962 when protests challenged the suppression policy and created a civil society for fire’s management. That year Tall Timbers Research Station in Florida hosted its first fire ecology conference, and the Nature Conservancy conducted a controlled burn in Minnesota prairie. Those sparks kindled what became, for the American fire community, a revolution.

In 1968 the National Park Service reformed its national fire policy to encourage the return of good fire. The Forest Service followed with a more comprehensive overhaul after the 1977 season, which mattered because it remained the keystone agency in the national fire infrastructure. By 1978 the policies were in place to halt, if not roll back, those decades of misguided efforts toward fire exclusion. At the same time, agencies reorganized into more cooperative, less coercive arrangements. “Fire by prescription”—fire set or tolerated so long as it stayed within predetermined boundaries—would replace a simple-minded policy of suppression.

This was a revolution from above, however, and it took root only here and there. It flourished most tenaciously in Florida, where controlled burning—substituting tame fire for wildfire—became a general practice. It helped that, if you kept fire out of most Florida environments, the ill effects in the form of overstocked combustibles were visible within a handful of years. In the West, an equivalent shock might take several decades.

For most federal agencies, however, the 1980s were a lost decade. Climate and politics polarized. The Forest Service became increasingly dysfunctional. Urban sprawl began remaking the rural countryside into exurban enclaves. Suppression returned as the default setting. The Reagan years ended with an almost Wagnerian flourish as waves of flame rolled over Yellowstone National Park. The fires burned on and on throughout the summer, mesmerizing the public. While the event successfully alerted the public and the media to the new thinking that had begun 20 years earlier, it left policy and practice unaffected.

The real change came six years later, in 1994—an annus horribilis that killed 34 career firefighters, burned through a billion dollars in suppression costs, and convinced the fire community that firefighting on the old model was broken. The movement for reform revived—call it Revolution 2.0. By the end of the decade, Secretary of the Interior Bruce Babbitt declared that the country faced a “national fire crisis.”