So far this year, wildfires have scorched nearly 5 million acres in the U.S. That sounds like a lot, but compared with 2015, the season has been downright tame. Last year at this time, more than 9 million acres had already burned, and by the end of the year, that number would rise to more than 10 million — the most on record. In 2015, the Okanogan Complex grew into the largest fire Washington had ever seen, breaking a record set just the year before. California recorded some of its most damaging fires, including the Valley Fire, which torched around 1,300 homes. More than 5 million acres burned in Alaska alone. But that’s not to say that this year has been without drama. For instance, California’s Soberanes Fire, which was sparked by an illegal campfire in July, is still smoldering. The effort to contain that blaze is believed to be one of the most expensive — if not the most expensive — wildfire-fighting operations ever.
With wildfire, such superlatives have, paradoxically, become normal. Records are routinely smashed — for acreage burned, homes destroyed, firefighter lives lost and money spent fighting back flames. A study published earlier this year found that, between 2003 and 2012, the average area burned each year in Western national forests was 1,271 percent greater than it was in the 1970s and early 1980s.
Like the extreme hurricanes, heat waves and floods that have whipped, baked and soaked our landscape in recent years, such trends raise the question: Is this what climate change looks like?
In the Proceedings of the National Academy of Sciences this month, two researchers took on the tricky task of apportioning blame. “People have sort of thrown conjecture out there, saying that the big fire seasons we’ve had since 2000 are attributable to climate change,” said John Abatzoglou, the lead author of the new study and a climatologist at the University of Idaho. “But we wanted to go out and make an effort to try to quantify it.” How much of the recent uptick in fire activity is due to climate change versus other factors, like the natural drought cycle?
Abatzoglou and his co-author, Park Williams, a bioclimatologist at Columbia University’s Lamont-Doherty Earth Observatory, estimate that human-caused climate change was responsible for nearly doubling the area burned in the West between 1984 and 2015. If the last few decades had been simply dry, instead of some of the hottest and driest on record, perhaps 10.4 million fewer acres would have burned, they say.
Wildfire is particularly responsive to temperature increases because heat dries things out. It sucks moisture from twigs and needles in the forest the same way it does from clothes in a dryer, turning this vegetation into the kindling, or “fine fuel,” that gets wildfires going. In arid environments, small increases in temperature dramatically hasten this process, helping to prime the forest for a spark. Abatzoglou and Williams’s models simulated the relationship between temperature and the dryness of forest fuels under multiple scenarios, including ones that reflected the human-caused temperature increase, as estimated by climate models, and ones that factored out human-caused changes.
To shore up confidence in their estimates, they repeated the analyses in their study using eight different fuel-aridity metrics and then averaged the results. “One thing that gives me confidence is that all eight of these essentially lead to the same conclusion,” Williams said. “All eight have been increasing. All correlate well with fire.”
In the end, they found that more than half of the observed increase in the dryness of fuels could be attributed to climate change. Fuel aridity, in turn, correlated very closely with fire activity for the time period they looked at — it explained about 75 percent of the variability in acreage burned from year to year. “That means that it is a top dog,” Williams said. “Correlation is not causation, but the correlation is so strong that it’s very hard to get a relationship like this if it’s not real.”
Williams added that as aridity increased, wildfire activity increased exponentially. “This isn’t a gradual process,” he said. “Every few years we’re kind of entering a new epoch, where the potential for new fires is quite a bit bigger than it was a few years back.”
This isn’t the first study to warn that climate change is increasing the risk of wildfires, and fire scientists generally agree on that point. What’s still up for debate is exactly how much climate change has influenced recent fire activity, and this study is the first attempt at a hard answer. Bob Keane, a research ecologist with the U.S. Forest Service in Missoula, Montana, says the methods are solid and represent the best available science for tackling the question. The basic takeaway is also sound. “It’s a big proportion,” said Keane, who was not involved in the research. But the precise numbers Abatzoglou and Williams produced? Keane wouldn’t take them to the bank.
Fire modeling is full of uncertainty because “fire is a complex process,” Keane said. “It’s hard for any of us to wrap our heads around it and say something that’s meaningful.” Actual fire behavior is influenced by both large-scale processes, such as climate, and smaller factors, such as the slope of a hillside or the strength of the wind on a given day.
And then there’s Smokey Bear. Smokey was the public face of a longstanding and ecologically backward federal fire policy. “Only you can prevent forest fires,” Smokey told us. Some of those ads even characterized wildfire as “shameful waste” that “weakens America.” In fact, fire is a natural and essential process in Western forests. Certain trees, including many lodgepole pines, even need fire to reproduce — their cones open and release their seeds only when heated by flames. But for decades, land managers did their damnedest to prevent and suppress fire, allowing many forests to become overstocked with trees. Those trees became the fuels that set the stage for larger blazes. They also influence a fire’s severity, which can affect the ability of some types of forests to recover from fire.