Future Tense

In Defense of Algorithms

They get a bad rap—but that’s because people don’t understand them. 

Photo: The algorithms that guide food preparation are more commonly called recipes. (Stock-Asso/Shutterstock)

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. On Thursday, Dec. 10, Future Tense will host a three-hour conversation on “The Tyranny of Algorithms” in Washington, D.C. For more information and to RSVP, visit the New America website.

OK, I’m biased, given the countless hours I have spent over the past three decades creating algorithms, implementing them on computers, and then writing about them in academic papers that few people read. But if nothing else, all those nights and weekends of writing computer code have given me an appreciation of the power of procedure, not only in programming but also in so much else. Algorithms don’t always involve complicated feats of programming; at heart, they are sequences of steps to move toward a goal—and they are so fundamental that we can easily forget what our world would be like without them.

Thinking about making a nice home-cooked meal? That’s off the table without the algorithms that guide food preparation, more commonly called recipes. Trying to figure out which of multiple possible routes to take on your commute to work or school? Without algorithms, you’d be unable to make that decision yourself, and Waze wouldn’t be able to do it for you. Without algorithms, people couldn’t build houses, birds couldn’t build nests, and spiders couldn’t weave webs.

Recognized or not, algorithms underlie our approach to tasks from the momentous to the mundane—from choosing a college or a home to knowing that when getting dressed we need to put our socks on before our shoes and not the other way around. When performing a task we know well, such as tying shoelaces, we don’t even give a moment’s thought to the role of order, to the way completion of one step sets the stage for the next. But when confronted with a less familiar task, like assembling a newly purchased piece of furniture that arrives as a mystifying collection of bolts, brackets, and pieces of wood, we immediately realize that the algorithmic guidance of even a poorly written instruction booklet is infinitely better than none at all.
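
To make the role of order concrete, here is a minimal sketch in Python (the dressing steps and their dependencies are hypothetical, chosen purely for illustration). It uses a topological sort to produce a sequence in which no step comes before the steps it relies on: socks before shoes, every time.

    # A tiny dependency graph of (hypothetical) dressing steps.
    # Each step maps to the set of steps that must be finished first.
    from graphlib import TopologicalSorter  # standard library, Python 3.9+

    steps = {
        "shoes": {"socks", "trousers"},
        "belt": {"trousers"},
        "jacket": {"shirt"},
        "socks": set(),
        "trousers": set(),
        "shirt": set(),
    }

    # static_order() lists the steps so that prerequisites always come first.
    print(list(TopologicalSorter(steps).static_order()))
    # e.g. ['socks', 'trousers', 'shirt', 'shoes', 'belt', 'jacket']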

Algorithms go back to the earliest written history, and far earlier still. Take boustrophedon, a method of writing, sometimes used by the ancient Greeks and Etruscans, in which successive lines run in opposite directions, just as you would plow a field or mow a lawn. The word boustrophedon combines the Greek terms for ox and turn, an echo of the back-and-forth plowing algorithm that was undoubtedly reinvented thousands of times as agriculture overspread much of the world. The word algorithm itself has a complex and interesting etymology, with roots in Middle English, Greek, Latin, and the name of one of history’s greatest mathematicians, the ninth-century algebra pioneer Muhammad ibn Musa Al-Khwarizmi.
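
For the programmers in the audience, the ox-turn pattern is easy to sketch. The little grid below is made up purely for illustration; the point is simply the alternating direction of travel.

    def boustrophedon(rows):
        """Visit a grid row by row, reversing direction on alternate rows."""
        for i, row in enumerate(rows):
            # even-numbered rows run left to right, odd-numbered rows right to left
            yield from (row if i % 2 == 0 else reversed(row))

    field = [["a", "b", "c"],
             ["d", "e", "f"],
             ["g", "h", "i"]]
    print(list(boustrophedon(field)))  # ['a', 'b', 'c', 'f', 'e', 'd', 'g', 'h', 'i']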

Today, of course, we can use computers to perform algorithms. In fact, it is no exaggeration to say that computation itself is inseparable from the algorithms that enable it and therefore that computers would serve no purpose at all without the algorithms that run on them. When you send a text message, do an Internet search, or stream a movie to a laptop computer, you are benefiting from a nested set of interdependent algorithms—some that perform basic mathematical functions such as addition, others that build on addition to perform multiplication, and still others that build on those functions to turn images, video, audio, or text into packets of binary data and route them to their destinations.
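
As one deliberately simplified illustration of that layering (a sketch only; real processors and networking stacks are far more elaborate), here is multiplication assembled from nothing but repeated addition:

    def add(a, b):
        # stand-in for the primitive operation the next layer builds on
        return a + b

    def multiply(a, b):
        """Multiplication built from repeated addition (assumes b is a nonnegative integer)."""
        total = 0
        for _ in range(b):
            total = add(total, a)
        return total

    print(multiply(6, 7))  # 42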

While our electronic devices have dramatically quickened the speed of computation, the algorithms they run remain fundamentally human endeavors, designed with creativity and often ingenuity, both building on and contributing to a mathematical heritage that dates back millennia. Take Internet search, an action that has become so commonplace that we forget how incomprehensible it would have been just a few generations ago. A growing fraction of the entire corpus of human knowledge resides on the Internet, accessible in under a second to much of the world’s population. That this can happen at all is a testament to many things, not the least of which is the suite of algorithms created by Google and its predecessors to efficiently index a mind-boggling amount of data and, even more impressive, call it up in an instant in response to a few keywords.
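
One of the core ideas, stripped of everything that makes real search engines fast and smart, is the inverted index: a map from each word to the documents that contain it. The three "documents" below are invented for illustration; Google's actual systems layer ranking, spelling correction, distribution across data centers, and much more on top of this basic trick.

    from collections import defaultdict

    documents = {
        1: "algorithms guide food preparation",
        2: "search algorithms index a mind-boggling amount of data",
        3: "recipes are algorithms for food",
    }

    # Build the inverted index: each word points to the documents containing it.
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in text.lower().split():
            index[word].add(doc_id)

    def search(*keywords):
        """Return the documents that contain every keyword."""
        hits = [index.get(word.lower(), set()) for word in keywords]
        return set.intersection(*hits) if hits else set()

    print(search("algorithms", "food"))  # {1, 3}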

Algorithms also save lives. They underpin computed tomography, or CT, and magnetic resonance imaging, or MRI, scans, help physicians manage suspected stroke patients, and drive the weather forecasting models that predict violent storms. In the future, increasingly sophisticated collision avoidance technologies will lead to an enormous reduction in the number of people killed and injured in motor vehicle accidents each year.

And algorithms can cause a lot of trouble. They have been blamed for contributing to racial profiling by police in Chicago, for terminating the health benefits of thousands of low-income senior citizens in California, and for flagging an 8-year-old boy as an airport security risk.

In short, algorithms reflect the people who create them and the organizations that use them: mostly (but not always) good and well-intentioned but also imperfect and sometimes caught up in unintended consequences. That’s not a reason to throw all algorithms out the window, but it is certainly a reason to ensure that they are analyzed not only through traditional technical measures such as the “false positive rate” but also in terms of the social impacts arising from their use. And while we should absolutely address those instances when algorithms are used in ways that result in social harms, we shouldn’t forget that algorithms also help manage the supply chain that brings food to grocery stores, gas to gas stations, and electricity to homes and offices. Without them, big data would be incomprehensible data, clouds wouldn’t compute, and tablets would still all be made of stone.
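
For readers who haven't met the "false positive rate" before, a quick sketch with made-up numbers shows what it measures: of everything that was actually harmless, how much did the algorithm flag anyway?

    # Hypothetical screening results, invented purely for illustration.
    false_positives = 30    # harmless cases the algorithm wrongly flagged
    true_negatives = 970    # harmless cases the algorithm correctly passed over

    false_positive_rate = false_positives / (false_positives + true_negatives)
    print(f"{false_positive_rate:.1%}")  # 3.0%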