A.I., M.D.

When doctors need advice, it may not come from a fellow human.

June 3 2014 12:57 PM


So far, computers have gotten really good at parsing so-called structured data—information that fits neatly into categories. In health care, it is often stored as billing codes or lab test values, which don’t necessarily capture patients’ full range of symptoms or even their treatments. Images, radiology reports, and the notes doctors write about each patient can be more useful. That’s unstructured data, and computers are less savvy at handling it because doing so requires making inferences and a certain understanding of context and intent.

“Computers are notoriously bad at understanding English,” said Peter Szolovits, the director of MIT’s Clinical Decision Making Group. “It’s a slow haul, but I’m still optimistic.” 

Computers are getting better at reading unstructured information. Suppose a patient says he doesn’t smoke. His doctor checks “no” in a box—structured data, easily captured by a machine.


But then the doctor notes that the patient’s teeth are discolored or that there are nicotine stains on his fingers—clues that the patient in fact does smoke. Soon a computer may be able to highlight such discrepancies, bringing to the fore information that otherwise might have been overlooked.
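The smoking example can be sketched as a toy rule: compare a structured checkbox against cue phrases in the free-text note and flag any mismatch. Real systems rely on natural-language processing far beyond keyword matching; the field names and cue phrases below are illustrative assumptions, not taken from any actual product.

```python
# Toy sketch: flag a discrepancy between a structured "smoker" field
# and the doctor's free-text note. Field names and cue phrases are
# hypothetical; real clinical NLP is far more sophisticated.

SMOKING_CUES = ["nicotine stains", "teeth are discolored", "smells of smoke"]

def flag_smoking_discrepancy(record):
    """Return True when the structured field says the patient does not
    smoke, but the free-text note contains a smoking-related cue."""
    if record.get("smoker") is not False:
        return False  # structured field already says yes (or is unknown)
    note = record.get("note", "").lower()
    return any(cue in note for cue in SMOKING_CUES)

patient = {
    "smoker": False,  # checkbox: patient says he doesn't smoke
    "note": "Exam: teeth are discolored; nicotine stains on fingers.",
}
print(flag_smoking_discrepancy(patient))  # True: worth a second look
```

Even this crude rule shows why unstructured text is hard: the note never says “the patient smokes,” so the system must infer it from indirect evidence.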

In recent years, universities, tech companies, and venture capital firms have invested millions in improving computer analysis of images and words. Companies are popping up to capitalize on study findings suggesting that AI can be used to improve care.

“Artificial intelligence—ultimately that’s where the biggest quality improvements will be made,” said Euan Thomson, a partner at venture capital firm Khosla Ventures. 

Many challenges remain, experts say. Among them is the tremendous expense and difficulty of gaining access to high-quality data and of developing smart models and training them to pick up patterns.

Most electronic medical record-keeping systems aren’t compatible with each other. The data is often stored on servers at individual clinics or hospitals, making it difficult to build a comprehensive reservoir of medical information.

Moreover, the systems often aren’t hooked up to the Internet and therefore can’t be widely distributed or accessed like other information in the cloud. So, in contrast to the vast amount of data on Google and Facebook, the information can’t be mined from anywhere by those interested in analyzing it.

From the perspective of privacy advocates, this makes some good sense: A researcher’s treasure trove is a hacker’s playground.

According to Dr. Russ Altman, the director of Stanford University’s Biomedical Informatics Training Program, “it’s not the greatest time to talk about” health records on the Web, given security scandals such as the Edward Snowden leaks and the Heartbleed bug.

Drawing the Line

Also standing in the way are concerns about how far computers should encroach on doctors’ turf. As AI systems get smarter, experts say, the line between making recommendations and making decisions could become murkier. That could cause regulators to view the systems as medical devices, subject to review by the U.S. Food and Drug Administration.

Wary of the time and expense required for FDA approval, companies engineering the systems—at least for now—are careful not to describe them as diagnostic tools but rather as information banks.

“The FDA would be down on them like a ton of bricks because then they would be claiming to practice medicine,” said MIT’s Szolovits.

At the moment, he said, the technology isn’t good enough to tell doctors with 100 percent certainty what the best course of treatment for a patient may be. Others agree.

Back at her clinic on Long Island, Mariwalla is thankful for the information that the AI system can provide. For the patient with that blistering skin condition, she took the machine’s suggestion for an alternative medication. The patient has recovered, Mariwalla said.

But she’s careful to add that she made the call herself—based in part on her conversation with her patient. “That’s where medical judgment comes in,” she said. “You can’t rely on a system to tell you what to do.”

This piece originally appeared in Kaiser Health News, an editorially independent program of the Kaiser Family Foundation.

Daniela Hernandez has written about health, neuroscience, AI, and tech for the Minneapolis Star Tribune, ESPN the Magazine, Scientific American Mind, and Wired.
