Earlier this summer, the deputy chief of state and private forestry, James E. Hubbard, sent out a memo laying out guidelines for the remainder of wildfire season. His message: Effective immediately, regional, area, and station directors should adopt the default position of “full containment” when approaching all wildfires.
Full containment means going all-out to put a blaze out as soon as possible—an approach that to most of us would seem pretty sensible. But it actually goes against decades of research that suggest a more balanced approach: allowing some wildfires to take their natural course, playing a cleansing role in forest ecosystems and clearing out underbrush and other flammable material (called fuel)—all of which helps reduce the size of wildfires in the future.
Hubbard’s order covered only this season and, he says, does not represent a permanent policy shift. But while many foresters understand the reasoning behind the deputy chief’s request for a change—more on that in a minute—it has some of them worried. Why? Because even if they return to standard procedure next year, the immediate fix puts the long-term solution for managing the country’s forests—and fires—in serious jeopardy and could actually make things much worse.
With a record-breaking season of heat and drought has come an unprecedented season of wildfires. Recently, a Washington state blaze burned more than 20,000 acres of national forest and destroyed 60 homes, forcing 400 families to evacuate and prompting the governor to declare a state of emergency. On the national scale, that’s just a tiny blip. According to the National Interagency Fire Center, 31 large fires are currently active, while 7 million acres have already burned across the country this season—well above the 10-year average and approaching the previous record set in 2006.
And that means our resources for fighting them—the aviation fleets, the smoke jumpers, and the budgets that pay for it all—are stretched thin. You can measure the cost in dead 20-year-old firefighters in Idaho, in tens of millions of dollars in charred homes, or in the expense of getting boots on the ground. Or you can look at the giant smoke-blob map Julia Whitty has posted over at Mother Jones. That high price is why Hubbard wants local forest managers to reverse battle tactics and pursue aggressive suppression, even in the remote wilderness. The agency’s $984-million budget for this year has already been busted, with cost projections of $1.4 billion.
Predictive Services—the office that uses a huge amount of recorded data to make short- and long-term predictions about wildfires—may have taken stock of the situation and encouraged Hubbard’s order to fight all fires aggressively. But as with so many short-term solutions in the face of imminent threat, the long view is being set aside, and that’s the scary part.
For more than 100 years, the U.S. Forest Service has been researching, reacting to, and generally trying to understand wildfires, and along the way, our methods have changed significantly. In 1910, the “Big Blowup” destroyed about 3 million acres—roughly the size of Connecticut—in Idaho, Washington, and Montana. The devastation of the Big Blowup, much of which reportedly took place in just 36 hours, cemented the idea of fighting fires aggressively and immediately. In the 1930s, the Selway forest fires in Idaho and Montana led to the “10 a.m.” policy: Try to contain every fire by 10 a.m. the morning after it is reported. But in the 1960s and ’70s, foresters began to recognize that fires were important parts of natural systems, and the policy of “fire use” came into play—the idea that local and regional managers could and should allow some fires to burn. That’s partly because in mountainous areas, fires are nigh impossible to control, and sometimes it’s deemed too dangerous to send firefighters into a blaze. But they also serve an important purpose: preventing even worse burns. “Thinning the herd,” it seems, happens in the forest—and can actually help build natural hurdles to subsequent blazes.
The 2000s have seen some very bad years for wildfires—in part thanks to a bevy of stupid or reckless human-started blazes—but until this summer the general policy stayed mostly the same: Let forest and land managers make the decisions about what to let burn and what to put out. This year’s shift is a reflection of just how up against the wall we are, nationally, when it comes to resources for fighting wildfires. The scariest part is that things seem only to be getting worse with climate change. A climate study published this summer in Ecosphere suggests that over the next 30 years, 38 percent of the planet will see increases in fire activity. By the end of the century, that number is projected to be 62 percent. If this holds true, now may not be the time to go back to firefighting techniques popular in the 1910s.
When I wrote earlier this season about the newer technologies we’re now using to fight fires, I spoke with David Calkin, a research forester at the Rocky Mountain Research Station in Missoula, Mont., who holds a Ph.D. in economics. Calkin’s view of forest management is macro, and he doesn’t seem prone to exaggeration. Even when I pressed him on climate change or on how damaging the policy shift could be, he was more likely to talk about the natural progressions of systems than to say that the extreme weather and awesome scope of the burning—which NASA satellites are now monitoring from orbit—were cause for extreme alarm. But there’s no question that he is worried.
“When we look at climate models, some parts of the world are expected to get wetter and hotter, some expected to be drier and hotter,” he said. “But there is a concern that this is a step back. Even in the policy direction it acknowledges that it’s not a good long-term strategy.”
You could almost consider it a surge strategy—like the kind we used in Iraq with, shall we say, mixed success. Throw everything we’ve got at the fires now, to stop things from getting worse. Is there a third way? Until we can make our own rain, cool the climate, and stop shooting live ammo at rocks during drought season, probably not. Even our best technologies and treatments are still relatively puny against Mother Nature.
Any forester will tell you—fire is a part of life. And we’ve got ourselves a firewalker’s dilemma: a choice between setting our shovels against the small stuff while ignoring the smoke of a massive burn on the horizon, or taking the long view. This season, we’ve set our shovels to the small stuff, and lives and housing developments have been saved. But it’s going to set us back in the long-term effort to manage our forests and to let nature take its course. This season, we’re making the next big burn even bigger.
This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.