Future Tense

How Much Is the Future Worth?

The arcane, fascinating academic debate over how to value our grandchildren’s well-being.

People make their way through a flooded street during the aftermath of Hurricane Harvey on Tuesday in Houston.

Photo illustration by Natalie Matthews-Ramo. Photo by Brendan Smialowski/Getty Images and Thinkstock.

This article is part of “Future of the Future,” a series about the practice—and future—of prediction.

Scientists have known for some time that climate change was likely to result in more severe hurricanes and flooding along the U.S. Gulf Coast, including in cities like Houston. They couldn’t predict the timing or the specifics, of course. And the beating the city has taken in the past three years, which have brought three “500-year” storms, has exceeded even the most dire warnings. It might not have mattered even if scientists had accurately called the coming carnage: Local officials have consistently ignored the predicted effects of climate change in their planning, opting to develop huge swaths of land that were known to be either prone to flooding or essential to the city’s defenses against floods.

But let’s imagine that scientists had known for decades that a storm such as Hurricane Harvey was likely—and that Houston’s public officials had accepted the science and attempted to plan accordingly. That raises the question: How much would the city have been willing to spend in, say, the 1990s, to mitigate the misery it is experiencing now? Or, more pointedly: How much should it have been willing to spend?

Looking at Houston today, with the city’s streets submerged, dozens dead, and countless homeless, you might think the answer would be, “whatever it took.” But economists have a different answer. It’s based on a concept called social discounting. And the fierce debate over just how to calculate it is gradually reshaping how policymakers think about long-term issues ranging from flood protection to nuclear power to the laying of high-speed rail.

The debate may be relatively obscure; it has played out largely in academic literature and the occasional government-commissioned report, rather than in the media or on the political stage. But the stakes are astronomical: Assume an average social discount rate of roughly 1.5 percent—as the famous 2006 Stern Review on climate change proposed—and global warming becomes an intensely urgent problem that demands deep and immediate fiscal sacrifices. Likewise, the risks involved in nuclear power generation and nuclear waste storage might border on untenable. But set that same rate closer to 3 percent, as Yale economist William Nordhaus suggested, and you might conclude we’re better off making only modest investments today, perhaps through the sort of clean-energy subsidies enacted by the Obama administration. Bump it to 5 or even 10 percent, as others have recommended, and inaction starts to look like the sneaky-smart move—even if it means weathering more disasters like Hurricane Harvey down the road.

So what, exactly, does that rate represent, who decides how to set it, and how is it actually used? How, in other words, do we put a price on doom?

* * *

For starters, it’s worth understanding how discounting works in the abstract. Let’s say that if you do nothing today, you can expect to suffer a loss of $100,000 a decade from now due to a giant robot attack. A classical economist would begin by asking: How much would you pay right this moment to prevent this giant robot attack from happening?

If you said $100,000, you’re implying that a dollar 10 years from now (when there are giant robots) is worth just as much to you as a dollar today. But economists would question your decision. For one thing, behavioral research shows that most people value their present well-being more dearly than their future well-being. In the economic literature, that’s sometimes called “pure time preference.” Even if you cared about your future self, what with the giant robots and all, just as much as you care about your present self, there would still be reasons not to spend that $100,000 today. After all, there’s at least some chance you won’t be alive 10 years from now, in which case you would have needlessly impoverished yourself in the final years of your life. And then there’s the opportunity cost. For example, if you invested that $100,000 today at a healthy interest rate, it might be more like $150,000 by the time the robots rise up. So in effect, you’d come out $50,000 ahead by not spending to prevent the attack. (Note that this assumes the cost of the robot attack can be fully measured in dollars. The accounting gets a lot trickier if lives are lost, psychological traumas inflicted, public confidence in robots irreparably shaken, or other hard-to-quantify damages sustained.)

To figure out how much you should spend today, then, you’d need to apply a discount rate: some percentage by which you devalue a future benefit for each year that you have to wait to receive it. At an annual discount rate of 5 percent, $100,000 in the year 2027 would be worth about $61,000 right now. So, according to classical economic theories, you should pay no more than that to build your giant-robot–killing slingshot.
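
In code, that calculation is a one-liner. Here’s a minimal sketch in Python, using the illustrative figures from the robot scenario along with the longer horizons discussed below:

```python
def present_value(future_amount, rate, years):
    """Discount a future dollar amount back to the present at a fixed annual rate."""
    return future_amount / (1 + rate) ** years

# The robot scenario: a $100,000 loss, 10 years out, discounted at 5 percent.
print(present_value(100_000, 0.05, 10))   # ~61,391: spend no more than this today

# The same loss at the old 10 percent OMB rate, over longer horizons.
print(present_value(100_000, 0.10, 50))   # ~852, i.e. less than $1,000
print(present_value(100_000, 0.10, 100))  # ~7.26, i.e. less than $10
```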

This logic—which reflects the behavior of individuals and corporations in the marketplace—might work well enough when it comes to a straightforward financial decision. And for much of the late 20th century, it was the prevailing model in cost-benefit analyses of projects public and private alike. In 1972, President Nixon’s Office of Management and Budget imposed a discount rate of 10 percent on cost-benefit analyses for all federal agencies, to discourage spending on projects whose returns were unlikely to exceed those of a sound private investment. That rate, which now seems incredibly steep, stood for 20 years until the OMB dropped it to a still-hefty 7 percent in 1992.

But thoughtful economists have long recognized that this approach carries some seriously uncomfortable implications when it comes to spending on future public goods. If you extrapolate a 10 percent discount rate over 50 years, $100,000 in 2067 equates to less than $1,000 today. Make it 100 years, and it’s less than $10.

On an individual level, that might make intuitive sense: Almost nobody plans that far into their future, because in 100 years we’ll all be dead. When talking about a public investment on that kind of time scale, however, you run up against the obvious problem that the people who’ll be receiving the benefit in 50 or 100 years are not the same people paying for it today. A private citizen might be perfectly justified in preferring $10 today to $100,000 in 2117—but is the government justified in valuing today’s citizens 10,000 times as highly as their great-grandchildren? Not only does that seem rather unfair to the great-grandchildren, but when we’re talking about things like climate change and nuclear power, it amounts to a shockingly high tolerance for risking the near-annihilation of the entire human species.

To return to our initial thought experiment: a relatively high discount rate, based on market returns, would imply that Houston officials of decades past would have been largely justified in prioritizing short-term economic growth over long-term flood risk—even if they had known that something like Hurricane Harvey was bound to happen someday.

Which, to look at Houston today, feels like an unacceptable conclusion.

* * *

Classical discounting was developed to deal mostly with analyses of infrastructure investments that would benefit society over a long time frame, such as the U.S. highway system. Yet some of its limitations became apparent even before the current climate change debate, when planners in the 1970s tried to apply it to questions of nuclear energy generation and the storage of nuclear waste. Like climate change, nuclear energy involves risks on a scale of hundreds of years or more, which simply don’t apply to something like a high-speed–rail project. Likewise, warnings of biodiversity loss prompted economists to think about how values such as “sustainability” could be incorporated into cost-benefit calculations. In recent decades, they’ve increasingly turned to alternative ways of defining the social discount rate—ones that seek to better capture our intuitions about the balance between short-term gain and long-term risk.

A flashpoint in the debate came in 2006, when a team led by the economist Sir Nicholas Stern published a report on the costs of climate change commissioned by the British government. That report, known as the Stern Review, reached the startling conclusion that immediate, painful sacrifices were required to avert the catastrophic long-term effects of human-induced climate change. Specifically, it recommended investing as much as 1 percent of global GDP per year—many times more than the world is currently spending—to curb greenhouse gas emissions. In 2008, Stern raised that figure to 2 percent of global GDP.

The recommendation broke so sharply with prior analyses that it touched off a string of public responses from other economists examining how Stern had arrived at his conclusions. Economist William Nordhaus of Yale identified Stern’s approach to discounting as the crux of his analysis. Rather than use market-based interest rates as a proxy for the social discount rate, Stern had employed a different type of equation—one that took into account questions of equity between present and future generations, along with some rather pessimistic predictions about future economic growth. In Stern’s view, our grandchildren’s well-being deserved just as much consideration as our own: He set the rate of “pure time preference” at a minuscule 0.1 percent, simply to reflect the remote possibility that humans might go extinct in any given year. In other words, he placed the same weight on our descendants’ well-being as he did on ours—and not just our grandchildren, but our grandchildren’s grandchildren’s grandchildren’s grandchildren, and so on. He also re-evaluated the traditional assumption that future generations will be much wealthier than our own and therefore will value a dollar less. His analysis yielded an effective discount rate of just 1.4 percent, which was considered radical at the time.
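
The equation Stern employed is the standard Ramsey rule, which assembles a discount rate from ethical and growth parameters rather than reading one off the market. Here is a minimal sketch in Python, using the parameter values widely attributed to the Review (the growth figure is approximate):

```python
# Ramsey rule: discount rate r = delta + eta * g
delta = 0.001  # pure time preference: Stern's near-zero 0.1 percent
eta = 1.0      # how sharply an extra dollar loses value as incomes rise
g = 0.013      # assumed per-capita consumption growth, roughly 1.3 percent

r = delta + eta * g
print(f"{r:.1%}")  # 1.4% -- Stern's effective discount rate
```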

Nordhaus and many other mainstream economists rejected Stern’s approach, arguing that it substituted arbitrary value judgments for empirical market data. Nordhaus raised the compelling point that such a low discount rate might look less benign when applied to arenas other than climate change:

Imagine the preventive war strategies that might be devised with low time discount rates. Countries might start wars today because of the possibility of nuclear proliferation a century ahead; or because of a potential adverse shift in the balance of power two centuries ahead; or because of speculative futuristic technologies three centuries ahead. It is not clear how long the globe could survive the calculations and machinations of zero-discount-rate military strategists.

Out of that academic battle emerged a tentative consensus that put discount rates higher than Stern’s, but still far lower than those used in 20th-century policymaking—and significantly lower than market returns to private investment. A 2010 federal working group, tasked with establishing an official cost of carbon to be used in evaluating climate policies, settled on a rate of 3 percent.

Yet some environmental economists today insist that even that rate is much too high, and that the profession’s entire approach to discounting must change when it comes to global risks such as climate change. A 2012 paper by Laurie Johnson, then chief economist for the nonprofit Natural Resources Defense Council, and Chris Hope of the University of Cambridge, argued against the default assumption that future generations will be better off than we are in a world that keeps getting hotter. Johnson, now executive director of the nonprofit Climate Cost Project, points out that the people most affected by climate change are likely to be society’s poorest.* If that’s the case, it would seriously undermine the notion that they’d value a dollar less than we do, even if GDP overall continued to rise. “If you just look at the overall financial impact, you’re going to miss the inequality caused by climate change,” Johnson told me in a phone interview. “You can’t just treat a dollar as a dollar as a dollar.”

Johnson says she’s now convinced that economists should analyze climate change through the lens of minimizing risk, rather than maximizing utility. She notes that most insurance policies have an expected value of less than zero for the buyer—that is, the policy is likely to cost more than it pays out. Yet people still buy them, because if something awful happens in the future, they’ll be much worse off than they are today.
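
A toy calculation, with numbers invented purely for illustration, shows why that trade can be rational for a risk-averse buyer, modeled here with a standard logarithmic utility function:

```python
import math

wealth = 100_000   # current wealth
loss = 50_000      # the disaster, if it strikes
p = 0.01           # probability the disaster strikes
premium = 600      # more than the expected loss of p * loss = $500

# Without insurance: gamble on the disaster not happening.
eu_uninsured = (1 - p) * math.log(wealth) + p * math.log(wealth - loss)

# With insurance: pay the premium and eliminate the risk.
eu_insured = math.log(wealth - premium)

print(eu_uninsured)  # ~11.50599
print(eu_insured)    # ~11.50691: higher, so the negative-EV policy is worth buying
```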

Meanwhile, a growing group of economists has developed yet another approach in which discount rates on long-term projects start out approximating market interest rates, but decline over time. This draws on behavioral research showing that, while people much prefer a dollar today to a little more than a dollar a year from now, they’re less concerned with the difference between waiting 20 years for a payout and waiting, say, 21 years for a somewhat larger payout. In other words, we apply a different framework when thinking about long-term benefits as opposed to short-term ones.
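
Here is a minimal sketch of how such a declining schedule might work, with breakpoints loosely patterned on the declining long-term rates the U.K. Treasury’s Green Book has adopted (the exact numbers are illustrative):

```python
def declining_rate(year):
    """An illustrative declining schedule: the rate falls as the horizon lengthens."""
    if year <= 30:
        return 0.035
    if year <= 75:
        return 0.030
    return 0.025

def discount_factor(years, rate=None):
    """Compound year by year; use a fixed rate if given, else the declining schedule."""
    factor = 1.0
    for t in range(1, years + 1):
        factor /= 1 + (rate if rate is not None else declining_rate(t))
    return factor

loss = 100_000  # a loss suffered 100 years from now
print(loss * discount_factor(100, rate=0.035))  # ~3,206 at a flat 3.5 percent
print(loss * discount_factor(100))              # ~5,082 with the declining schedule
```

The effect is that distant harms retain meaningfully more present-day weight than they would under any flat market-based rate, without overturning ordinary short-term discounting.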

Such considerations, which complicate the process of setting a discount rate, appear to be seeping into the policy mainstream, albeit gradually. The landmark 2014 report by the Intergovernmental Panel on Climate Change included a deep and subtle discussion of discount rates, working its way toward a formula that incorporated issues of justice between generations and between rich and poor, along with risk aversion, without entirely abandoning traditional assumptions about economic growth. The economist Charles Kolstad, who helped lead the team that wrote that chapter, told me he believes the consensus among economists has shifted since 1996, when a previous IPCC report used much higher discount rates.

But he said the more relevant gap today is not the one that divides economists such as Nordhaus and Stern. Rather, it’s the chasm between the academic consensus—loose though it may be—and the realities of politics and policymaking in a messy democracy. Even as Nordhaus was knocking Stern’s “radical” approach to discounting in 2008, both ultimately favored more aggressive action on climate change than almost any national government was actually taking at the time. Johnson may have found the 2010 U.S. working group’s carbon price to be far too low (it was later revised upward), but Donald Trump this year issued an executive order effectively canceling the entire project of calculating greenhouse gas emissions’ social cost. Back in Houston, the then-head of the Harris County flood control district told ProPublica last year that his agency had no plans to study the potential impacts of climate change on local flooding.

All of that is evidence of the disconnect, Kolstad said, between our own self-interested, and sometimes short-sighted, calculations about the future, and the ones that economists debate in academic journals. “If you asked individuals in Houston in 1990” to invest in hurricane prevention, Kolstad mused, “a lot of them would probably say, ‘Who knows where I’m going to be living in 20 years? If I’m worried, I can move away.’ But the city of Houston isn’t going anywhere.”

*Correction, Sept. 1, 2017: This article originally misidentified the nonprofit of which economist Laurie Johnson is executive director. It’s the Climate Cost Project, not the Carbon Cost Project.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture.