The Book Club

Lone Heroics vs. Good Medical Systems

Darshak,

I fondly recall our last conversation over tea. And I want to state how much I admire your work, not only your writing but your specialty of pediatric cardiology. I learned as a third-year medical student during my pediatrics rotation that I did not have the emotional reserve to care for very sick children. It takes not only a keen mind but a resilient heart to serve children with serious illness.

I enjoy a lively debate, so let’s begin. The story of my internship is presented as an example of how emotion—specifically, extreme fear and anxiety—can blur, if not paralyze, doctors’ thinking. There is a real difference between learning in the classroom or being supervised on the wards and that first moment when you are the doctor called to make decisions. Recently, the use of simulators, particularly with high-tech mannequins, has helped to bridge this gulf between the world of paper cases and the flesh-and-blood decision making that goes on in the ER or ICU. So, I agree that certain systems and procedures can improve care and help doctors make better decisions. If obtaining an ultrasound routinely on the morning after surgery helps detect early changes in a child that prefigure a cardiac arrest, then this practice clearly should be integrated into routine care. In my own specialty, oncology, we now characterize certain oncogenes—or cancer-causing mutations—to guide us in the treatment of breast cancer or colon cancer.

But let’s examine your contention that I’m focusing too much on the “lone specialist who distills clinical data into a satisfying diagnosis when all others have failed.” You suggest that this ignores the reality that physicians work as part of teams. Pediatric cardiology is certainly a team effort in hospitals. But the vast majority of care involves a solo internist, gynecologist, pediatrician, or primary care physician who sees his or her patients in an office. This differs from your own subspecialty practice. What’s more, “teams” are not immune from cognitive errors. In fact, in some cases, they become more susceptible to them. For example, I write in How Doctors Think about Anne Dodge, a woman who experienced dramatic weight loss and saw more than 30 doctors over the course of 15 years; each thought she was anorexic—and some discounted her own testimony that she was not—when in fact she was suffering from celiac disease. (All patients’ names have been changed to protect their identities.) Each of these doctors can be considered a member of her “team.” They all communicated with one another, sharing notes, referring back to her team leader (her internist), and reviewing one another’s records. As it happened, the entire team made a cognitive error, and it’s difficult for me to imagine how a “system” could have corrected it.

That is because algorithms and treatment guidelines are based on prototypes. They are not substitutes for individual thinking. And they break down when cases are atypical or complex. It’s critical to factor in that human biology is highly variable, that diseases have multiple presentations, that symptoms exist with a range of intensity and frequency, while algorithms and treatment guidelines are, of necessity, simplifications. It is impossible to standardize all care.

This is precisely when you need the “lone specialist who distills clinical data.” The 31st member of Anne’s “team” did not follow an algorithm or a treatment guideline. He thought creatively, rearranged the pattern of her symptoms and clinical findings into a different picture, and made a diagnosis that had eluded everyone else. Once processes and procedures are set in motion after anchoring on a particular diagnosis, it becomes very difficult to break the chain. “Diagnosis momentum” is a major issue in misguided care. So, what we need is not just more systems but doctors who are always thinking—better education and better role models for young doctors, role models who challenge received wisdom and thus correctly diagnose patients like Anne Dodge. That 31st doctor is indeed a “hero” in Anne’s case.

Note that I don’t dismiss algorithms and treatment guidelines out of hand. Rather, I use them as a point of reference and consider how they may or may not reflect the person in front of me. But in the last few years, I’ve grown increasingly concerned that the promulgation of algorithms and guidelines makes it all too easy for us to stop thinking, indeed, to stop leading in diagnosis and treatment, and makes us passive followers. Yes, about 80 percent of Americans are correctly diagnosed and treated, according to research done in outpatient and hospital settings. The 15 percent to 20 percent who are misdiagnosed are usually failed by errors in doctors’ thinking. Misdiagnosis is extraordinarily costly, not only for the patient but for society: The more advanced a clinical problem becomes, the more intensive and expensive the treatment.

To remedy this, you suggest we should “instead focus our limited resources on improving procedures and structure of medical care.” No argument in terms of improving the procedures and structure of medical care in a broken system. But good thinking is an infinite resource—and it’s free. If we educate medical students, house staff, and experienced physicians about cognitive errors and how to become more self-aware about how our emotional temperature can bias our perceptions and distort our logic, that investment is lifelong and can be lifesaving. Better care begins with better thinking, self-reflection, and understanding of how easy it is to assume that a robust young man couldn’t possibly be having a heart attack—when, in fact, he is. In the book, I tell the story of Pat Croskerry, a doctor in Halifax who misdiagnosed such a man, in part because the patient mirrored Croskerry’s athleticism and love of the outdoors. Croskerry only realized in retrospect how his emotions had too quickly caused him to discharge the man rather than keep him for observation and further testing.

Yet I was hard-pressed to find any medical student, intern, resident, or attending physician who had thought very deeply about the cognitive pitfalls of practicing medicine—and even the pitfalls of expertise itself. This is an important gap in medical education. Each trainee or attending knew a misdiagnosis had occurred, but few of them realized why. So, I must respectfully disagree. It seems to me that we can begin to fix an ailing system primarily by teaching doctors how to think about their thinking and by having laypeople—patients, their friends, and families—know how doctors think.

I’m curious what you think of the current reward system in medicine. Procedures are richly reimbursed, but thinking (“cognitive medicine”) is poorly paid for. I also wonder how you view patient advocacy, meaning how much education, input, and, indeed, control patients should seek in their care.

Best,
Jerry