Will Starving Yourself Help You Live Longer?
A major new study says what you eat may matter more than how much you eat.
Photograph by Jeff Miller/UW-Madison, University Communications.
Rhesus monkeys do not often appear on the front page of the New York Times, but on July 10, 2009, there were two, pictured side by side: Canto, age 27, and Owen, age 29. In monkey terms, this made them the equivalent of senior citizens, but the striking thing was that Owen looked like he could have been Canto’s beer-drinking, dissipated dad. His hair was patchy, his face sagged, and his body was draped in rolls of fat. Canto, on the other hand, sported a thick (if graying) mane, a slender frame, and an alert, lively mien.
What made the difference? Diet. Since early adulthood, Canto had been fed 30 percent less food than Owen. The two monkeys were part of a long-running study of dietary restriction and aging, conducted at the Wisconsin National Primate Research Center in Madison. Beginning in the late 1980s, the researchers had been deliberately underfeeding Canto and some of his unfortunate colleagues. By late 2008, enough animals had died that the scientists could report meaningful results in Science.
The differences were as striking as the side-by-side photos: The calorie-restricted monkeys were far healthier, in terms of basic measures such as blood pressure, and had a far lower incidence of age-related diseases such as diabetes and cancer. And they seemed to be living longer: While 37 percent of the control monkeys had died of age-related causes at the time of the report, only 13 percent of the restricted monkeys had done so.
The results seemed to confirm one of the longest-held beliefs about aging: that eating less—a lot less—will help you live longer. Since the 1930s, scientists have shown that restricting the diets of many animals, from fruit flies to trout to mice, extends lifespan, both average and maximum. The phenomenon has been known for so long, and observed so often, that it has been accorded the status of near-dogma in some circles. A devoted group of believers, convinced the principle extends to humans, practices caloric restriction, sometimes eating as little as 1,200 calories per day.
Now a new paper has come out in Nature, reporting a parallel monkey study conducted by the National Institute on Aging. The NIA study began around the same time as the Wisconsin study, with similar experimental conditions. But the Nature authors found no increase in lifespan; the calorically restricted animals lived no longer, statistically, than their well-fed cousins. Even stranger, the NIA control monkeys, the ones who ate a lot, actually lived just as long as the calorie-restricted Wisconsin primates. What gives?
Many of us simply roll our eyes and click away when yet another medical study contradicts the last study—so what else is new? Coffee’s bad for you, until it’s good for you—and so is red wine. Antioxidants are essential, or they’re useless. And so on. Contradictory studies are an essential part of the science-news stream—and, in fact, an important part of science itself. But that doesn’t make it any less frustrating.
The parallel monkey studies are some of the most important and closely watched experiments on aging to be conducted in our lifetimes. It was expected, even assumed, that the NIA results would show that caloric restriction extended longevity—the holy grail of aging research.
The fact that it didn’t, and that the two studies conflict, has unintentionally revealed a different truth about diet and aging. In both studies, the monkeys that ate less were healthier by a number of measures—and suffered far less from age-related disease. Even better, when taken together, both studies reveal a different path toward living a healthier life—one that doesn’t require self-starvation. To understand the new findings, let’s begin with a taster’s tour of the strange, fascinating world of caloric restriction.
The concept goes back to the 1930s, when a young professor of nutrition named Clive McCay noticed that hatchery trout seemed to live longer when they were fed less. At the time, he was looking for more economical ways to raise the fish (it was the Great Depression, after all), and the long-lived, underfed fish were too small to interest anyone. But the phenomenon puzzled him enough that he set up an experiment in his Cornell lab, in which he fed one group of rats about one-third less than another group of rats. In his much-cited 1935 paper, he showed that the restricted rats lived more than 60 percent longer than the normally fed animals—more than 800 days, versus an average of 500 days.
This astonishing result was the equivalent of humans living to age 125 or beyond. Even more amazing was that the experiment was repeatable, not just in rats and mice. Over the years, various researchers have shown that caloric restriction can extend life in bats, dogs, and even spiders, and on down to nematode worms and single-celled organisms like yeast. After decades of work, it remains the only way known to increase maximum lifespan. So a lot is riding on the concept, scientifically speaking.
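The scaling behind that "age 125" figure is easy to check. Here's a quick back-of-envelope sketch in Python; the ~78-year human baseline is an illustrative assumption, not a figure from the study:

```python
# Back-of-envelope check of McCay's 1935 result scaled to humans.
# The 78-year human baseline lifespan is an assumption for
# illustration only, not a number from McCay or this article.
restricted_days = 800   # lifespan of McCay's calorie-restricted rats
normal_days = 500       # average lifespan of his normally fed rats

extension = restricted_days / normal_days - 1   # fractional increase
print(f"extension: {extension:.0%}")            # 60%

human_baseline_years = 78
equivalent = human_baseline_years * (1 + extension)
print(f"human equivalent: {equivalent:.0f} years")  # 125 years
```

The same 60 percent extension applied to a 78-year baseline lands right around the "age 125 or beyond" the article describes.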
The idea made its way into pop culture, too. In On the Road, Jack Kerouac writes:
I stumbled out of Harrisburg—cursed city! The ride I proceeded to get was with a skinny haggard man who believed in controlled starvation for the sake of health. When I told him I was starving to death as we rolled East, he said, “Fine, fine. There’s nothing better for you. I myself haven’t eaten for three days. I’m going to live to be 150 years old.”
In the 1990s, Leonard Guarente of MIT discovered a class of longevity genes in yeast called sirtuins that appear to be activated by a lack of food. Sirtuins appear to be “conserved” in evolution, meaning they show up in nearly all species, on up to humans. Sirtuins are thought to have evolved as a way to enable animals to survive periods of famine. They seem to work by regulating certain metabolic pathways and reducing the amount of damage cells endure.
Caloric restriction, it appeared, activates some sort of deep survival mechanism common to nearly all life forms. If researchers could somehow identify and isolate that mechanism, they’d be that much closer to some kind of longevity pill. Except for one inconvenient fact: Caloric restriction itself does not always work.
The study published today is itself a specimen of scientific longevity, dating all the way back to the late 1980s. One of the first major long-term projects at the National Institute on Aging (founded in 1974 as part of the National Institutes of Health) was to test the effects of caloric restriction (CR) in monkeys, the lab animal closest to humans. Such studies in humans are problematic, as one might imagine, because it’s not easy to convince people to spend decades starving themselves—and even if you could, you’d have to wait a lifetime for results (actually longer, if it worked as advertised). Monkeys can’t cheat on their diet or complain about it, and they live only 30-odd years.
The initial group of 60 monkeys was split into two groups. Half were allowed to eat a full ration of food, while the rest were fed about 25 percent less. The monkeys were soon joined by another 60 animals; some were young, between 0 and 8 years old, while the rest were older, between 16 and 23 when the experiment started.
The data started coming out in dribs and drabs, in mundane descriptive studies at first. Then in 2003, the NIA team reported hopefully that “preliminary evidence suggests that CR will have beneficial effects on morbidity and mortality.” While 80 percent of the monkeys were still alive, the restricted animals had better measures of cardiovascular health, hormone levels, and blood-sugar management, an early indicator of diabetes risk. So it came as a bit of a surprise, eight years later, to find that the hungry monkeys are not actually living longer.
This was a surprise, and yet at the same time not surprising. The history of calorie restriction research is strewn with odd results that have been left unexplained (at best) or outright ignored (at worst). When Steven Austad of the University of Texas–San Antonio tested wild-caught mice, for instance, he found no caloric-restriction-induced increase in lifespan. In another study, researchers created 42 different cross-bred mouse strains and found that in a third of the strains, caloric restriction actually seemed to shorten lifespan. And even Clive McCay, the father of caloric restriction, found weird results: In his 1935 experiment, caloric restriction worked only in the males.
In fact, caloric restriction really seemed to work best in standard laboratory mice. This may be because they are predisposed to eat a lot, gain weight, and reproduce early—and thus are more sensitive to reduced food intake. (Slate’s Daniel Engber has written about how overfed lab mice have distorted scientific research.)
But in a long-awaited, well-funded monkey study like this, an “odd” result could not be ignored. Still stranger was the fact that even though the underfed monkeys were healthier than the others, they still didn’t live longer. They had lower incidence of cardiovascular disease, as well as diabetes and cancer—and when these diseases did appear, they did so later. “To me I think it’s one of our very interesting findings,” says lead author Rafael de Cabo. “We can have a dramatic effect on healthspan [the length of healthy life] without improving survival.”
Even odder was the fact that the NIA’s control monkeys seemed to be doing much better than the Wisconsin controls. In fact, the NIA controls seemed to be on track to live as long, or longer, than the Wisconsin calorie-restricted monkeys. Some of them were approaching 40 years old, previously the highest recorded age for rhesus monkeys. (Four of the NIA monkeys have actually surpassed 40 at this writing.) What was that about?
At first, it seemed like a scientist’s nightmare: The control group is indistinguishable from the test group. In clinical trials, a result like this would kill any drug candidate. Then de Cabo took a closer look at a seemingly minor difference between the Wisconsin and NIA studies: the animals’ diets.
De Cabo is attuned to food. A native Spaniard who’s reputed to make some of the best paella this side of Cadiz, he would seem an unlikely advocate for caloric restriction. “I love to cook,” he says. “Would I like to practice caloric restriction? I don’t think so.”
It didn’t take him long to realize that the animals’ food was more important than anyone had thought. The NIA monkeys were fed a natural-ingredient diet, made from ground wheat, ground corn, and other whole foods; the Wisconsin animals ate a “purified” diet, a heavily refined type of food that allowed the researchers to control the nutritional content more precisely. Because the NIA monkeys were eating more natural ingredients, de Cabo realized, they were taking in more polyphenols, micronutrients, flavonoids, and other compounds that may have health-promoting effects.
Furthermore, the NIA diet consisted of 4 percent sucrose—while in the Wisconsin diet, sucrose accounted for some 28 percent of the total calories. High sugar consumption is thought to be a primary driver of obesity, diabetes, and possibly some cancers. “In physics, a calorie is a calorie,” says de Cabo. “In nutrition and animal physiology, there is more and more data coming out that says that the state of the animal is going to depend more on where the calories are coming from.”
In other words, it matters whether you eat at Whole Foods, like the suburban-Maryland NIA monkeys—or at the ballpark, like the Wisconsin monkeys. Guess which works out better in the end?
But what does all this really mean for humans? Is it really “healthier” to starve oneself, as some people believe? Or will this latest monkey study finally let us off the hook?
Calorie-restriction data in humans has been pretty spotty, for good reason. You try cutting back 30 percent of your food intake, and see how it goes. (Slate’s Emily Yoffe attempted it a few years ago; read her account here.) Most studies have been short-term, and none have measured longevity, for obvious reasons. One of the few true long-term trials in humans came out of the drama-ridden Biosphere project of the early 1990s.
Funded by billionaire Ed Bass, Biosphere was a 3-acre sealed environment built in the Arizona desert that was supposed to simulate life in a space station. The facility would be completely self-sufficient, with greenhouses producing all the crew’s food, as well as the oxygen they’d need to breathe. The expedition scientist, Roy Walford, happened to be a strong proponent and key early researcher of caloric restriction. (He also bore a striking resemblance to Mr. Clean.)
When it became clear early in the Biosphere project that the earthbound “space station” could not grow enough food to feed the crew of eight, Walford took the opportunity to place his fellow crew members on a restricted diet—30 percent lower in calories, but dense in nutrients. In his study based on the two-year experience, Walford reported that the main effect of caloric restriction was to drastically lower the crew’s cholesterol levels, to 140 and below—well below the average for people in the industrialized world. Walford concluded that a calorie-restricted diet would have the same beneficial effects that he and other scientists had observed in mice.
Walford would never get to test his hypothesis fully; he died of Lou Gehrig’s disease in 2004, and in his later years he blamed his poor health on the Biosphere experience. Other, larger trials of calorie restriction in humans have pointed to similar results: In the short term, it’s been shown to benefit overweight or obese people, which is no surprise. And a longer-term study of voluntary calorie-restrictors has pointed to improved arterial health, as well as better blood glucose management and other markers of aging.
But so far, there’s no evidence that humans gain any longevity benefit from calorie restriction. “That data will not emerge until about 2040,” says Brian Delaney, president of the Calorie Restriction Society.
And when it does, chances are any effect of calorie restriction may vary from person to person, depending on genetics. “It’s complicated,” says Nir Barzilai of Albert Einstein College of Medicine in New York.* “To some of us it might work, and for some of us it might be dangerous.”
Several studies have shown that excessive leanness—seen often in calorie-restricting humans—can be as risky as obesity. Taken together, these studies suggest that the optimal body-mass index is about 25, which is on the verge of being overweight.
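For readers who want to see where they fall relative to that figure: body-mass index is, by standard definition, weight in kilograms divided by height in meters squared (the definition is standard medical usage, not something spelled out in these studies; the 1.75 m example person is hypothetical):

```python
# Standard BMI formula: weight (kg) / height (m)^2.
# This is the conventional definition, not a formula from the article;
# the example figures below are hypothetical.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

# A 1.75 m (5'9") person weighing about 76.5 kg (169 lbs.)
# sits right at the "optimal" BMI of 25.
print(round(bmi(76.5, 1.75), 1))  # 25.0
```

By the conventional cutoffs, 25 is exactly the boundary between "normal" and "overweight," which is why the article calls it "on the verge of being overweight."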
But if it’s OK to be almost overweight, it might not pay to go beyond that. Another key difference between the two monkey studies has to do with the definition of “ad libitum.” While the Wisconsin control-group monkeys were allowed to stuff themselves, with the equivalent of an all-you-can-eat buffet for several hours at feeding times, the NIA monkeys were given a fixed amount of food. “You could view it as the Wisconsin monkeys were overindulging, like the rest of the American population,” says Rozalyn Anderson, a member of the Wisconsin team. Compared with their Wisconsin brothers, then, the NIA monkeys in the non-calorie-restricted control group were arguably practicing a mild form of calorie restriction—and that, Anderson suggests, might have made a difference.
For decades, ever since McCay, the holy grail of aging research has been to extend maximum lifespan—to push out the frontiers of human longevity, past 100, 120, or more. But while in theory those limits may be malleable, a careful look at these major primate studies shows that they might not be, in practice. Even so, calorie restriction does seem to reduce—drastically in some cases—one’s risk of developing age-related diseases like cancer and diabetes. So while calorie restriction may or may not make you live longer, overeating and obesity will certainly make you die sooner. And if eating less doesn’t always increase lifespan, it does improve “healthspan,” our allotment of healthy years.
In the next few years, we’re going to learn a lot about how different genetic types respond to medicine, diet, and other things. And while we might not (yet) know how to live forever, more of us will be able to avoid a long, sad decline and will live longer, healthy lives. If we can get there simply by eating the right foods, but not too much, and avoid becoming obese—then just knowing that is a pretty good start.
Correction, Sept. 4, 2012: This article originally misstated the name of Albert Einstein College of Medicine.
Bill Gifford has written for Outside, Wired, Men's Health, and other magazines. He is working on a book about the future of medicine.