Medical Examiner

Teaching Old Drugs New Tricks

Pharmaceutical companies won’t study whether cheap old drugs work better than expensive new ones. But NIH should.

Suppose a researcher discovered that some cheap, long-available drug could treat a devastating disease. Patients wouldn’t need exorbitantly priced new drugs, and they might be able to avoid surgery. Insurers and hospitals would save millions by adopting the economical new treatment.

It would be great news for everyone except pharmaceutical companies. They don't care if old, off-patent drugs have novel uses. Their profits depend on new, expensive, patented drugs. They're not about to undertake costly testing to prove that a discount drug whose patent has expired works as well as a pricey new one.

Since the pharmaceutical companies are the economic engine behind drug development, and since there is no incentive for them to find new uses for old drugs, such research is no one's mission. A Wall Street Journal story last month nicely illustrated the problem, describing the inability of Dr. G. Umberto Meduri to get sufficient backing for a major study to prove what his small, promising studies have indicated: Low doses of common steroids can help prevent death from sepsis, an often deadly bloodstream infection. The steroids, no longer under patent, cost about $50 per course of treatment. Eli Lilly & Co., the Journal points out, has just released a new sepsis drug that costs $7,000 per course. And Lilly is spending millions to promote its drug.

This would seem like a job for the National Institutes of Health. It’s in the United States’ financial interest—as well as public health interest—to see whether steroids work on sepsis. If they’re effective, taxpayers could save millions in Medicare and Medicaid reimbursements. But so far Meduri has failed to get federal funding. A spokesman for NIH says the vast majority of applicants do not get funded, and it’s true that even the best system is going to leave some worthy studies undone. But Meduri’s case and others suggest that novel uses of existing compounds—therapies that could improve lives at little cost—often have a hard time getting attention at NIH, especially if they contradict prevailing medical opinion.

NIH’s main mission is—and should be—basic biomedical research, understanding how the human body functions at a molecular level. NIH is also a center for clinical research, but clinical trials receive only one-sixth the funding that basic science does, frustrating investigators who say clinical research deserves to be treated with more urgency.

For example, promising findings that the amino acid homocysteine might be as good as, or possibly better than, cholesterol at predicting heart disease languished for more than a decade because of lack of funding. Drug companies avoided studying homocysteine for an obvious reason: The treatment for elevated homocysteine is folic acid and B vitamins, which cost next to nothing. No pharmaceutical company wanted to test whether lowering homocysteine is as important as lowering cholesterol. Cholesterol-lowering drugs, after all, earn billions for the pharmaceutical companies.

Again, NIH was the obvious place to turn, but it wasn't interested. According to a New York Times article on the controversy, NIH was long considered "a kind of ground zero for the cholesterol camp." Dr. Kilmer McCully, the physician credited with discovering the homocysteine connection, lost his funding and his position at Harvard Medical School for advocating a line of inquiry so contrary to accepted medical belief. Today, there is powerful evidence that homocysteine levels are a marker not only for heart disease, but also for stroke and Alzheimer's. Yet, even now, as an NIH Web site points out, "Clinical intervention trials are needed [emphasis added] to determine whether supplementation with vitamin B6, folic acid, or vitamin B12 can help protect you against developing coronary heart disease."

When a pair of Australian researchers, Barry Marshall and Robin Warren, presented findings in the early 1980s showing that the bacterium Helicobacter pylori, not stress and excess stomach acid, caused most peptic ulcers, they were derided by the medical establishment. At the time, the drug companies were introducing new acid reducers, the staggeringly profitable drugs now available over the counter as Tagamet and Zantac. As the Journal points out, drug companies are often the primary suppliers of information about drugs to physicians. So for years doctors gave little credence to the bacterial infection theory of ulcers. Such a theory would mean that patients could be cured with a short course of antibiotics, rather than merely receiving symptomatic relief from long-term treatment with costly acid reducers. It wasn't until 1994 that the NIH convened a panel that accepted the infection theory.

This is not to say NIH rarely does clinical studies; it does many. Nor does NIH always fail to notice promising uses for old compounds. Right now NIH is recruiting patients for a massive study on whether selenium and vitamin E can prevent prostate cancer, and it's even investigating whether the spice turmeric can prevent colon cancer. (NIH is doing another kind of research drug companies won't: studying the long-term effects of the most popular prescription drugs.)

Finding significant, unexpected uses for drugs has a long history. Some major discoveries in the treatment of mental illness resulted from seeing surprising benefits in mood or behavior in patients who were treated with drugs for purely physical ailments. Reporting on the recently concluded meeting of the American Society of Clinical Oncology, the New York Times wrote, "There were particularly promising reports involving new uses for old drugs."

NIH’s budget has doubled in the last five years to $27 billion. Now that it’s so flush with cash, it’s time for the NIH to search more systematically for potential lifesavers that are already on the pharmacy shelves.