Will Starving Yourself Help You Live Longer?
A major new study says what you eat may matter more than how much you eat.
It appeared, then, that caloric restriction activates some sort of deep survival mechanism common to nearly all life forms. If researchers could somehow identify and isolate that mechanism, they’d be that much closer to some kind of longevity pill. Except for one inconvenient fact: Caloric restriction itself does not always work.
The study published today is itself a specimen of scientific longevity, dating all the way back to the late 1980s, not long after the founding of the National Institute on Aging in 1974 as part of the National Institutes of Health. One of the institute’s first major long-term projects was to test the effects of caloric restriction (CR) in monkeys, the lab animal closest to humans. Such studies in humans are problematic, as one might imagine, because it’s not easy to convince people to spend decades starving themselves—and even if you could, you’d have to wait a lifetime for results (actually longer, if it worked as advertised). Monkeys can’t cheat on their diet or complain about it, and they live only 30-odd years.
The initial 60 monkeys were split into two groups: Half were allowed to eat a full ration of food, while the rest were given about 25 percent less. They were soon joined by another 60 animals; some were young, between 0 and 8 years old, while the rest were older, between 16 and 23 when the experiment started.
The data started coming out in dribs and drabs, in mundane descriptive studies at first. Then in 2003, the NIA team reported hopefully that “preliminary evidence suggests that CR will have beneficial effects on morbidity and mortality.” While 80 percent of the monkeys were still alive, the restricted animals had better measures of cardiovascular health, hormone levels, and blood-sugar management, an early indicator of diabetes risk. So it came as a bit of a surprise, eight years later, to find that the hungry monkeys were not actually living longer.
This was a surprise, and yet at the same time not surprising. The history of calorie restriction research is strewn with odd results that have been left unexplained (at best) or outright ignored (at worst). When Steven Austad of the University of Texas–San Antonio tested wild-caught mice, for instance, he found no caloric-restriction-induced increase in lifespan. In another study, researchers created 42 different cross-bred mouse strains and found that in a third of the strains, caloric restriction actually seemed to shorten lifespan. And even Clive McCay, the father of caloric restriction, found weird results: In his 1935 experiment, caloric restriction worked only in the males.
In fact, caloric restriction really seemed to work best in standard laboratory mice. This may be because they are predisposed to eat a lot, gain weight, and reproduce early—and thus are more sensitive to reduced food intake. (Slate’s Daniel Engber has written about how overfed lab mice have distorted scientific research.)
But in a long-awaited, well-funded monkey study like this, an “odd” result could not be ignored. Stranger still was the fact that even though the underfed monkeys were healthier than the others, they didn’t live longer. They had a lower incidence of cardiovascular disease, diabetes, and cancer—and when those diseases did appear, they did so later. “To me I think it’s one of our very interesting findings,” says lead author Rafael de Cabo. “We can have a dramatic effect on healthspan [the length of healthy life] without improving survival.”
Even odder was the fact that the NIA’s control monkeys seemed to be doing much better than the Wisconsin controls. In fact, the NIA controls seemed to be on track to live as long as, or longer than, the Wisconsin calorie-restricted monkeys. Some of them were approaching 40 years old, previously the highest recorded age for rhesus monkeys. (Four of the NIA monkeys have actually surpassed 40 at this writing.) What was that about?
At first, it seemed like a scientist’s nightmare: The control group is indistinguishable from the test group. In clinical trials, a result like this would kill any drug candidate. Then de Cabo took a closer look at a seemingly minor difference between the Wisconsin and NIA studies: the animals’ diets.
De Cabo is attuned to food. A native Spaniard who’s reputed to make some of the best paella this side of Cadiz, he would seem an unlikely advocate for caloric restriction. “I love to cook,” he says. “Would I like to practice caloric restriction? I don’t think so.”
It didn’t take him long to realize that the animals’ food was more important than anyone had thought. The NIA monkeys were fed a natural-ingredient diet, made from ground wheat, ground corn, and other whole foods; the Wisconsin animals ate a “purified” diet, a heavily refined type of feed that allowed the researchers to control its nutritional content more precisely. Because the NIA monkeys were eating more natural ingredients, de Cabo realized, they were taking in more polyphenols, flavonoids, micronutrients, and other compounds that may have health-promoting effects.
Furthermore, the NIA diet consisted of 4 percent sucrose—while in the Wisconsin diet, sucrose accounted for some 28 percent of the total calories. High sugar consumption is thought to be a primary driver of obesity, diabetes, and possibly some cancers. “In physics, a calorie is a calorie,” says de Cabo. “In nutrition and animal physiology, there is more and more data coming out that says that the state of the animal is going to depend more on where the calories are coming from.”
In other words, it matters whether you eat at Whole Foods, like the suburban-Maryland NIA monkeys—or at the ballpark, like the Wisconsin monkeys. Guess which works out better in the end?
Bill Gifford has written for Outside, Wired, Men's Health, and other magazines. He is working on a book about the future of medicine.