Shankar Vedantam is a people person. I don't mean that in the ordinary sense, as in, "Do I look like a @#$% people person?" I mean that, in addition to being delightful company, he writes about people. He's interested in how we think.
So when he writes about machines, as he did last week, something funny is going on. His topic was the recent Metro train crash in Washington, D.C., which killed nine people. How did it happen?
One theory is the automation paradox:
The more reliable the system, the more likely it is that humans in charge will "switch off" and lose their concentration, and the greater the likelihood that a confluence of unexpected factors that stymie the algorithm will produce catastrophe. ... After the previous fatal accident on Metro, in which a train overshot the Shady Grove station on an icy night, the National Transportation Safety Board found that the driver of the train had reported overshooting problems at earlier stops but was told not to interfere with the automated controls.
In that case, automation researcher Greg Jamieson points out, "For a year before the accident, the transit authority had put in position a directive that you were not to drive the train in manual." Vedantam concludes: "No matter how clever the designers of automated systems might be, they simply cannot account for every possible scenario, which is why it is so dangerous to eliminate human 'interference.'"
This is the problem we discussed after the February plane crash that killed 50 people near Buffalo, N.Y. Initial evidence indicated that the pilot misunderstood what the autopilot was doing, and, by overriding the machine, caused the crash. Further evidence presented at a May hearing confirms that
the plane, which was collecting ice on its windshield and wings, was slowing to an unsafe speed. But when a warning system began vibrating the control column to get their attention, the captain pulled the nose up when he should have pushed it down. ... [C]onfronting the vibrating column, called a stick shaker, was probably something new and startling. The airline that was contracted with Continental Airlines to make the one-hour commuter flight, Colgan Air, said on Wednesday that it had given the crew simulated training in the activation of the stick shaker, but not in the next step, activation of the stick pusher, which takes control and pushes the nose of the plane down. In this instance, the stick pusher kicked in shortly after the captain pulled instead of pushed. "I don't see any evidence that he ever understood the situation he was in," said Dr. Dismukes ...
Shortly after the Buffalo crash, I outlined three possible responses to such disasters. One was to take the controls away from the machines, on the grounds that difficult conditions require human attention and judgment. The opposite approach was to take the controls away from the humans, on the grounds that pilots can't be trusted to override the machine's superior judgment. A third, hybrid solution was to teach the humans how to read and collaborate with the machine's intentions.
The third approach seems to be the one most clearly supported by the evidence in the Buffalo crash: Flight crews must be trained to interpret and interact with their autopilots. Vedantam makes a similar point about automated systems in general: "Several studies have found that regular training exercises that require operators to turn off their automated systems and run everything manually are useful in retaining skills and alertness." We have to know when to second-guess our machines and how to operate without their help. Sometimes, they'll err fatally unless we intervene. But our intervention can itself be fatal. The key is to understand when to step in and when to butt out. That's the role of human intelligence in a machine-controlled world.
It's fun to go to summer sci-fi movies and wonder whether humans or machines would prevail in a mortal showdown. But in the real world, machines aren't our enemies. They're our collaborators. If those 50 people in Buffalo died in a fight between a human and a machine, it wasn't a fight chosen by either side. It was a misunderstanding. And since we're the ones who made the machines, it's our job to teach one another how to work with them, around them, and without them.