Rhesus monkeys do not often appear on the front page of the New York Times, but on July 10, 2009, there were two, pictured side by side: Canto, age 27, and Owen, age 29. In monkey terms, this made them the equivalent of senior citizens, but the striking thing was that Owen looked like he could have been Canto’s beer-drinking, dissipated dad. His hair was patchy, his face sagged, and his body was draped in rolls of fat. Canto, on the other hand, sported a thick (if graying) mane, a slender frame, and an alert, lively mien.
What made the difference? Diet. Since early adulthood, Canto had been fed 30 percent less food than Owen. The two monkeys were part of a long-running study of dietary restriction and aging, conducted at the Wisconsin National Primate Research Center in Madison. Beginning in the late 1980s, the researchers had been deliberately underfeeding Canto and some of his unfortunate colleagues. By late 2008, enough animals had died that the scientists could report meaningful results in Science.
The differences were as striking as the side-by-side photos: The calorie-restricted monkeys were far healthier, in terms of basic measures such as blood pressure, and had a far lower incidence of age-related diseases such as diabetes and cancer. And they seemed to be living longer: While 37 percent of the control monkeys had died of age-related causes at the time of the report, only 13 percent of the restricted monkeys had.
The results seemed to confirm one of the longest-held beliefs about aging: That eating less—a lot less—will help you live longer. Since the 1930s, scientists have known that restricting the diets of many animals, from fruit flies to trout to mice, will extend lifespan, both average and maximum. The phenomenon has been known for so long, and observed so often, that it has attained the status of near-dogma in some circles. A devoted group of believers, convinced the principle extends to humans, practices caloric restriction, sometimes eating as little as 1,200 calories per day.
Now a new paper has come out in Nature, reporting a parallel monkey study conducted by the National Institute on Aging. The NIA study began around the same time as the Wisconsin study, with similar experimental conditions. But the Nature authors found no increase in lifespan; the calorically restricted animals lived no longer, statistically, than their well-fed cousins. Even stranger, the NIA control monkeys, the ones who ate a lot, actually lived just as long as the calorie-restricted Wisconsin primates. What gives?
Many of us simply roll our eyes and click away when yet another medical study contradicts the last study—so what else is new? Coffee’s bad for you, until it’s good for you—and so is red wine. Antioxidants are essential, or they’re useless. And so on. Contradictory studies are an essential part of the science-news stream—and, in fact, an important part of science itself. But that doesn’t make it any less frustrating.
The parallel monkey studies are some of the most important and closely watched experiments on aging to be conducted in our lifetimes. It was expected, even assumed, that the NIA results would show that caloric restriction extended longevity—the holy grail of aging research.
The fact that it didn’t, and that the two studies conflict, has unintentionally revealed a different truth about diet and aging. In both studies, the monkeys that ate less were healthier by a number of measures—and suffered far less from age-related disease. Even better, when taken together, both studies reveal a different path toward living a healthier life—one that doesn’t require self-starvation. To understand the new findings, let’s begin with a taster’s tour of the strange, fascinating world of caloric restriction.
The concept goes back to the 1930s, when a young professor of nutrition named Clive McCay noticed that hatchery trout seemed to live longer when they were fed less. At the time, he was looking for more economical ways to raise the fish (it was the Great Depression, after all), and the long-lived, underfed fish were too small to interest anyone. But the phenomenon puzzled him enough that he set up an experiment in his Cornell lab, in which he fed one group of rats about one-third less than another group of rats. In his much-cited 1935 paper, he showed that the restricted rats lived more than 60 percent longer than the normally fed animals—more than 800 days, versus an average of 500 days.
This astonishing result was the equivalent of humans living to age 125 or beyond. Even more amazing was that the experiment was repeatable, not just in rats and mice. Over the years, various researchers have shown that caloric restriction can extend life in bats, dogs, and even spiders, and on down to nematode worms and single-celled organisms like yeast. After decades of work, it remains the only way known to increase maximum lifespan. So a lot is riding on the concept, scientifically speaking.
The idea made its way into pop culture, too. In On the Road, Jack Kerouac writes:
I stumbled out of Harrisburg—cursed city! The ride I proceeded to get was with a skinny haggard man who believed in controlled starvation for the sake of health. When I told him I was starving to death as we rolled East, he said, “Fine, fine. There’s nothing better for you. I myself haven’t eaten for three days. I’m going to live to be 150 years old.”
In the 1990s, Leonard Guarente of MIT discovered a class of longevity genes in yeast called sirtuins that appear to be activated by a lack of food. Sirtuins seem to be “conserved” in evolution, meaning they are found in nearly all species, on up to humans, and are thought to have evolved as a way to enable animals to survive periods of famine. They appear to work by regulating certain metabolic pathways and reducing the amount of damage cells endure.