
You Used to Get One Life. Now You Get Two. #NotDeadYet


Why we live so long.
Sept. 5, 2013, 5:18 AM

Why Are You Not Dead Yet?

Life expectancy doubled in the past 150 years. Here’s why.


Closely related were technologies to move wastewater away from cities, but as Grob points out in The Deadly Truth, the first sewage systems made the transmission of fecal-borne diseases worse. Lacking an understanding of germs, people thought that dilution was the best solution and just piped their sewage into nearby waterways. Unfortunately, the sewage outlets were often near the water system inlets. Finally understanding that sewage and drinking water need to be completely separated, Chicago built a drainage canal that in 1900 reversed the flow of the Chicago River. The city thus sent its sewage into the greater Mississippi watershed and continued taking its drinking water from Lake Michigan.

"Guard Against Tuberculosis" poster from the Office for Emergency Management, Office of War Information, Domestic Operations Branch, Bureau of Special Services.

Courtesy of U.S. National Archives and Records Administration

The germ theory of disease didn’t catch on all that quickly, but once it did, people started washing their hands. Soap became cheaper and more widespread, and people suddenly had a logical reason to wash up before surgery, after defecating, before eating. Soap stops both deadly and lingering infections; even today, kids who don’t have access to soap and clean water have stunted growth.

Housing, especially in cities, was crowded, filthy, poorly ventilated, dank, stinky, hot in the summer, and cold in the winter. These were terrible conditions to live in as a human being, but a great place to be an infectious microbe. Pretty much everyone was infected with tuberculosis (the main cause of consumption), the leading killer for most of the 19th century. It still has a bit of a reputation as a disease of the young, beautiful, and poetic (it claimed Frédéric Chopin and Henry David Thoreau, not to mention Mimì in La Bohème), but it was predominantly a disease of poverty, and there was nothing romantic about it. As economic conditions started improving in the 19th century, more housing was built, and it was airier, brighter (sunlight kills tuberculosis bacteria), more weather-resistant, and less hospitable to vermin and germs.

We live like kings today—we have upholstered chairs, clean beds, a feast’s worth of calories at any meal, all the nutmeg (people once killed for it) and salt we could ever want. But wealth and privilege didn’t save royalty from early deaths. Microbes do care about breeding—some people have evolved defenses against cholera, malaria, and possibly the plague—but microbes killed off people without regard to class distinctions through the 1600s in Europe. The longevity gap between the rich and the poor grew slowly with the introduction of effective health measures that only the rich could afford: Ipecac from the New World to stop bloody diarrhea, condoms made of animal intestines to prevent the transmission of syphilis, quinine from the bark of the cinchona tree to treat malaria. Once people realized citrus could prevent scurvy, the wealthy built orangeries—greenhouses where they grew the life-saving fruit.

Improving the standard of living is one important life-extending factor. The earliest European settlers in North America suffered from mass starvation initially, but once the Colonies were established, they had more food and better nutrition than people in England. During the Revolutionary War era, American soldiers were a few inches taller than their British foes. In Europe, the wealthy were taller than the poor, but there were no such class-related differences in America—which means most people had enough to eat. This changed during the 1800s, when the population expanded and immigrants moved to urban areas. Average height declined, but farmers were taller than laborers. People in rural areas outlived those in cities by about 10 years, largely due to less exposure to contagious disease but also because they had better nutrition. Diseases of malnutrition were common among the urban poor: scurvy (vitamin C deficiency), rickets (vitamin D deficiency), and pellagra (niacin deficiency). Improved nutrition at the end of the 1800s made people taller, healthier, and longer lived; fortified foods reduced the incidence of vitamin-deficiency disorders.

Portrait of French scientist Louis Pasteur in mid-career.

Photo courtesy of New York Public Library Archives/Tucker Collection

Contaminated food was one of the greatest killers, especially of infants; once they stopped breast-feeding, their food could expose them to typhoid fever, botulism, salmonella, and any number of microbes that caused deadly diarrhea in young children. (Death rates for infants were highest in the summer, evidence that they were dying of food contaminated by microbes that thrive in warm conditions.) Refrigeration, public health drives for pure and pasteurized milk, and an understanding of germ theory helped people keep their food safe. The Pure Food and Drug Act of 1906 made it a crime to sell adulterated food, introduced labeling laws, and led to government meat inspection and the creation of the Food and Drug Administration. 

People had started finding ways to fight disease epidemics in the early 1700s, mostly by isolating the sick and inoculating the healthy. The United States suffered fewer massive epidemics than Europe, where bubonic plague (the Black Death) periodically burned through the continent, killing as much as one-third of the population at its worst. Low population density prevented most epidemics from becoming widespread early in the country's history, but epidemics did cause mass deaths locally, especially as the population grew and more people lived in crowded cities. Yellow fever killed hundreds of people in Savannah in 1820 and 1854; the first devastating cholera epidemic hit the country in 1832. Port cities suffered some of the worst outbreaks because sailors brought new diseases and strains with them from all over the world. Port cities instituted quarantines starting in the 19th century, preventing sailors from disembarking if there was any evidence of disease, and on land, quarantines separated contagious people from the uninfected.


A smallpox epidemic in Boston in 1721 led to a huge debate about variolation, a technique that involved transferring pus from an infected person to a healthy one to cause a minor reaction that confers immunity. Rev. Cotton Mather was for it—he said it was a gift from God. Those opposed said that disease was God's will. People continued to fight about variolation (also called inoculation) and then about vaccination, which used the related cowpox virus and was introduced in the late 1700s. The fights over God's will and the dangers of vaccination (real in the past, imaginary today) are still echoing.

In the early 1900s, antitoxins to treat diphtheria and vaccines against diphtheria, tetanus, and pertussis helped stop these deadly diseases, followed by vaccines for mumps, measles, polio, and rubella.

Anne Schuchat, assistant surgeon general and the acting director of CDC's Center for Global Health, says it's not just the scientific invention of vaccines that saved lives: The "huge social effort to deliver them to people improved health, extended life, and kept children alive." Vaccines have almost eliminated diseases that used to be common killers, but she points out that "they're still circulating in other parts of the world, and if we don't continue to vaccinate, they could come back."

Estimated Annual Deaths Before Vaccine, and in 2004
Historical comparisons of morbidity and mortality for vaccine-preventable diseases with vaccines licensed or recommended before 1980

Courtesy of Sandra W. Roush, Trudy V. Murphy, and the Vaccine-Preventable Disease Table Working Group/Journal of the American Medical Association

Vaccines have been so effective that most people in the developed world don't know what it's like to watch a child die of pertussis or measles, but parents whose children have contracted these diseases because of anti-vaccine paranoia can tell them. "The mistake that we made was that we underestimated the diseases and we totally overestimated the adverse reactions [to vaccines]," says a father in New Zealand whose child almost died of an agonizing bout of tetanus.

Schuchat says the HPV vaccine is a huge priority now; only one-third of teenage girls have received the full series of three shots required to protect them against viruses that cause cervical cancer. The vaccines “are highly effective and very safe, but our uptake is horrible. Thousands of cases of cervical cancer will occur in a few decades in people who are girls now.”

A baby is vaccinated against smallpox at an emergency clinic in Karachi during the worst epidemic of smallpox in Pakistan's history, January 1962.

Photo by Keystone Features/Getty Images

Some credit for the historical decrease in deadly diseases may go to the disease agents themselves. The microbes that cause rheumatic fever, scarlet fever, and a few other diseases may have evolved to become less deadly. Evolutionarily, that makes sense—it’s no advantage to a parasite to kill its own host, and less-deadly strains may have spread more readily in the human population. Of course, sudden evolutionary change in microbes can go the other way, too: The pandemic influenza of 1918–19 was a new strain that killed more people than any disease outbreak in history—around 50 million. In any battle between microbes and mammals, the smart money is on the microbes.

Over the next week, we'll take a closer look at deaths in childbirth and changes in life expectancy in adulthood. We'll examine the evolution of old age and the social repercussions of having infants that survive and a robust population of old people, and we'll list some of the oddball, underappreciated innovations that may have saved your life. In the meantime, please play the Wretched Fate game and send us your #NotDeadYet survival stories on Twitter.

Please read the rest of Laura Helmuth's series on longevity, and play our "Wretched Fate" interactive game to learn how you would have died in 1890.


Correction, Sept. 5, 2013: This article originally misspelled John Stuart Mill's middle name.