
Maybe Not Everybody Should Learn to Code

Aug. 19 2013 7:45 AM

A software engineer’s take on the new education call to arms.

But if you aren’t dreaming of becoming a programmer—and therefore planning to embark on a lengthy course of study, whether self-directed or formal—I can’t endorse learning to code. Yes, it is a creative endeavor. At its base, it’s problem-solving, and the rewards for exposing holes in your thinking and discovering elegant solutions are awesome. I really think that some programs are beautiful. But I don’t think that most who “learn to code” will end up learning anything that sticks.

One common argument for promoting programming to novices is that technology’s unprecedented pervasiveness in our lives demands that we understand the nitty-gritty details. But the fact is that no matter how pervasive a technology is, we don’t need to understand how it works—our society divides its labor so that everyone can use things without going to the trouble of making them. To justify everyone learning about programming, you would need to show that most jobs will actually require this. But instead all I see are vague predictions that the growth in “IT jobs” means that we must either “program or be programmed” and that a few rich companies want more programmers—which is not terribly persuasive.

You could argue that people have an obligation to be responsible consumers, so they should have some idea of what their proprietary software is doing. People would certainly interact with software differently if they knew whether their personal data is actually "secure." But there are plenty of other things everyone should know and doesn't: civics, say, or more basic subjects like logic and science. For most people, software is just another black box that they will happily coexist with, and that's OK.


But I wonder why people are comfortable with thinking of computers as a scary black box in the first place. Computers do only what people tell them to do, and yet it is absurdly common to hear, “Windows crashed again! Call over the IT guy—it’s so complicated!” So many users do not feel empowered to understand how to use computers well, and I think that the urgency to spread programming is a symptom of this feeling. Perhaps if everyone had some practice telling computers what to do, tech intimidation wouldn’t be so prevalent. But many of the efforts to promote programming could have the opposite effect, turning computer science into the latest technical fad.

This is my nightmare vision—“everyone” approaches programming as a set of arbitrary technical details just because he or she should. With only bits and pieces, users can’t appreciate the ways that languages are designed to solve problems, and they are left with an even larger black box. With this approach to programming, their knowledge will eventually float into the ether in the company of other meaningless knowledge, like how to talk nicely to that broken Nintendo 64 cartridge.

This can be avoided if you have concrete problems to solve. Take a biologist who frequently uses software to analyze data. A graphical interface might not be able to extract all the information she wants, so she needs the flexibility of the command line to write her own scripts. This is a compelling reason to extend programming knowledge outside of the IT realm. Her knowledge comes from focused study of the topics she needs, or of a language specialized for her application. She doesn't just "learn Python."
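To make the biologist's case concrete, here is a minimal sketch of the kind of script she might write. Everything in it is invented for illustration (the file name, the column names, and the threshold are all assumptions, not anything from a real analysis pipeline), but it shows the flexibility a fixed graphical interface can't offer: a few lines of Python that filter a data file on exactly the condition she cares about.

```python
import csv

def filter_rows(path, column, threshold):
    """Return rows from a CSV file whose numeric value in `column`
    exceeds `threshold`."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        return [row for row in reader if float(row[column]) > threshold]

# Hypothetical usage: keep only the samples whose expression level
# (an invented column name) is above an invented cutoff of 2.5.
# hits = filter_rows("expression_data.csv", "expression_level", 2.5)
```

The point is not the specific code but the shape of the knowledge: she learns just enough of one language, driven by one problem, and the learning sticks because it earns its keep.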

I’m not sure it’s even possible to teach everyone how to code, but I do know that to mandate programming as a general education requirement would displace something else that we’re already failing to teach, and that’s not good, either. We don’t need everyone to code—we need everyone to think. And unfortunately, it is very easy to code without thinking.

This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page.

Chase Felker is a Slate software engineer.