In the past few years, programming has gone mainstream, as celebrities from Chris Bosh to President Obama jump on the “everyone should learn to code” bandwagon. The idea is that teaching kids to code will make them employable and help American students keep up with their competition abroad.
But this idea has generated substantial grumbling among programmers—myself included. Like a good computer scientist, I took the edict quite literally and had a pretty visceral reaction to it. I value the spread of programming knowledge to the extent that I value making all kinds of knowledge accessible: it offers a window into the interesting intellectual problems that computer science touches, and that exposure will undoubtedly help the field attract the most capable people to its ranks. Something has to trick people into banging their heads against the wall, whether it’s plush offices, intellectual rigor, prestige, or sheer beauty in the work.
I imagine that people get excited about physics by reading some Bad Astronomy or Richard Feynman’s QED, but no one jumps from reading about physics to doing physics—there is a substantial amount of school missing there. With programming, however, there’s very little separation between appreciating and making something. From the first days of learning a language like Python, you can write self-contained programs that build real things. As you move forward in your education, you’ll find an almost Wild West world that’s open both in the cultural and physical senses. And unlike other sciences, your credentials are primarily the programs you have written, not the stature of the professor whose lab you cleaned.
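To make that point concrete: a first self-contained Python program really can do something recognizably useful within days of starting. The sketch below is a hypothetical beginner exercise of my own invention, not one drawn from any particular course—it counts how often each word appears in a piece of text, which is roughly the level of a first real program.

```python
# A tiny, self-contained program a beginner might write early on:
# count how often each word appears in a piece of text.
from collections import Counter


def word_counts(text):
    """Return a dict mapping each lowercase word to its frequency."""
    words = text.lower().split()
    return dict(Counter(words))


if __name__ == "__main__":
    sample = "the quick brown fox jumps over the lazy dog the end"
    for word, count in sorted(word_counts(sample).items()):
        print(word, count)
```

Nothing here requires a lab, a degree, or anyone's permission—just an interpreter and a text editor, which is exactly the low barrier to entry the paragraph above describes.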
But this lack of authority comes at a price: It also means that anyone can call himself a teacher and spew nonsense to unsuspecting students. Several professionals have written about the alarming number of programming job applicants who overestimate their credentials. It seems that many computer science instructors—whether in person or in a self-guided online class—are failing to teach people to recognize what they don’t know. And if you don’t know what you don’t know, credentials are pretty easy to come by.
I’m not going to pretend that you need to be a genius to do useful things with a computer. It is possible to build a website solely from a facile use of Rails syntax. But in both the short and long term, you need to know more than that, and popular Web and mobile courses might not make this clear. Restricting yourself to the technology du jour makes it easy to memorize habits rather than think them through. Eventually those technologies and programming languages will go out of style, and you’ll need a flexible understanding to adapt your habits to the next thing. And without some sense of the bigger picture, you’re forced to hack away at a problem—which can take you pretty far, until you run into one that is better solved by more careful design.
Frankly, just the idea that you can learn to code in a year gives me the creeps: I would be terrified if someone with only a couple of classes were writing programs for me, not because he (of course, and unfortunately, most programmers are men) has learned anything wrong—but because of what he doesn’t know.
For those aspiring to be professional programmers, these services look like a great starting point; I completed Codecademy’s Python exercises and thought they were pretty well done. And there’s certainly no reason for this information to be locked behind a university’s price tag, especially when there is such demand for it in the market. Several of my colleagues are successful self-taught programmers, and these resources make it easier for that group to grow.