Future Tense

The Trump Administration Is Barely Regulating Self-Driving Cars. What Could Go Wrong?

An Uber driverless car rolls through Pittsburgh, Pennsylvania. 

Jeff Swensen/Getty Images

We tend to think of self-driving cars in utopian terms, as benevolent conveyances that, once optimized, will make their passengers safer by removing human shortcomings from the transportation equation. But until that future arrives, they're also large robots asking us to trust that they won't harm us. That includes not only passengers but also other drivers, kids who dart out into the street, bicyclists, storefronts, and basically anyone or anything that might be hit by a hulking pile of steel capable of moving at more than 60 miles per hour.

That trust isn’t going to be easy to win. But President Trump’s Department of Transportation doesn’t seem too concerned. Transportation Secretary Elaine Chao released a set of guidelines on Tuesday for the budding self-driving car industry. Her approach: Let’s not regulate it.

The new guidelines, dubbed Vision for Safety 2.0, actually scale back Obama-era rules released last year that were already quite lenient. Like the old guidance, Chao's new safety standards are as optional as a sunroof. "This Guidance is entirely voluntary, with no compliance requirement or enforcement mechanism," the document reads. That means that Lyft, Uber, and Waymo, Google's self-driving car project, are free to ignore them. And considering Uber's demonstrated disdain for following regulations, it's hard to imagine the Silicon Valley entrepreneur types behind these companies complying without hard federal requirements in place. (Of course, these companies are still beholden to state and local regulations, which are hardly uniform.)

Enforceability aside, the new guidance also includes fewer safety recommendations than last year's. There's now a 12-point safety standard, down from the 15 questions that Obama's DOT recommended carmakers consider. Self-driving car manufacturers are still being asked to think about things like how vehicles can safely pull over if something goes awry and how to operate safely on different types of roads, but the new guidelines drop considerations like driver privacy, which may become important down the road, since driverless cars by design collect massive amounts of data. While a light regulatory touch will likely help developers innovate quickly and test what works and what doesn't, a lack of real safety mandates could be a recipe for disaster, especially because these cars need to be tested on roads occupied by humans.

As Deborah Hersman, the president and CEO of the National Safety Council, put it, since the first self-driving car guidelines were released last year, "DOT has yet to receive any Safety Assessments, even though vehicles are being tested in many states." (A safety assessment is the document the DOT asks carmakers to voluntarily submit to demonstrate how they are approaching safety and the guidelines.) Surprise! When regulators don't require safety compliance, manufacturers don't comply.

The guidelines also no longer apply to cars with partial automation, in which drivers are still expected to "remain engaged with the driving task." The timing of Tuesday's scaled-back guidelines is telling. That same day, the National Transportation Safety Board found that Tesla's semi-autonomous Autopilot system, which is supposed to robotically steer and control a car, "played a major role" in a fatal crash in Florida last year. Joshua Brown, the driver, was the first person to die in a car that drives itself. The NTSB found that Brown's "inattention" paired with the design of Tesla's self-driving system "permitted the car driver's overreliance on the automation."

While the involvement of a self-driving car makes this case especially tragic, a disturbing number of people die in human-driven car accidents every year, too. In fact, the past two years saw the sharpest uptick in automobile-related deaths in more than half a century. Still, the answer to one problem isn't to barrel forward and bring technologies to market that could cause a new wave of fatalities without proper regulations in place to ensure the safety of those systems.

But even if the regulatory agencies are taking a hands-off-the-wheel approach here, that doesn't mean Congress has to. The House passed a proposal earlier this month that could force self-driving carmakers to make a clear case that their technology is safe enough to drive alongside cars with humans at the wheel. A companion bill is now being drafted in the Senate.

Still, the House proposal is designed to make it even easier for self-driving cars to hit the road by raising the number of exemptions from regular car regulations that self-driving manufacturers can request, meaning there could be up to 100,000 robocars on American roadways within a few years. Those exemptions could involve things like steering wheels, which autonomous carmakers may not want to include in their designs, or workarounds regulators haven't yet thought of. What it ultimately means is that we could have more autonomous cars on U.S. roads before we have a real sense of what it means for self-driving technology to be designed safely. The roads still belong to the rest of us. And so should the rules.