Future Tense

Software Engineers Need a Crash Course in Ethics

Facebook CEO Mark Zuckerberg. Photo by Kevork Djansezian/Getty Images

This article originally appeared on Pacific Standard magazine’s website.

When Irina Raicu first read about a new software program designed to take just a few details about a person, such as gender, hair color, whether or not the person has tattoos, and the number of minor offenses committed, and accurately predict how likely that person is to commit a felony, she got worried.

Not because Raicu, the Internet ethics program manager at Santa Clara University’s Markkula Center for Applied Ethics, is planning to switch to a career outside the law. She was troubled by something Jim Adler, the program’s creator, told a Bloomberg News reporter:

It’s important that geeks and suits and wonks get together and talk about these things … because geeks like me can do stuff like this, we can make stuff work—it’s not our job to figure out if it’s right or not. We often don’t know.

All too often, Raicu said, the geeks and suits and wonks simply don’t get together—at least not until the damage has already been done.

Like when Facebook inadvertently outed a few University of Texas students who had joined Queer Chorus, a student choir group. The president of the chorus added students to the group’s Facebook page, not realizing it would automatically notify all of their Facebook friends—some of whom the students hadn’t yet come out to, including family members who didn’t take the news well, to put it mildly.

Facebook has since added explanations to its Help Center and new privacy controls, but for those students, at least, it was too late.

That’s part of the reason Raicu thinks the geeks need an education in professional ethics, and why she worked with the Markkula Center to create a three-day crash course meant to be taken before engineers build programs that could affect consumers’ lives.

Software engineers used to work in big companies where they had checks on the choices they made. “Now, we’re talking about two guys in hoodies in a garage,” Raicu said. They deploy the code now and fix it later. “That’s why we need to get them thinking about this early.”

Software engineering students are required to take the same engineering ethics course any engineering student would, said Shannon Vallor, the software ethics curriculum’s primary author and an associate professor of philosophy at Santa Clara.

In 2000, ABET, the organization that accredits university engineering programs, started formally requiring that every program teach engineering ethics. But Raicu said she doesn’t know of a single engineering ethics course targeted to the challenges specific to software engineering. The classic case studies in engineering textbooks, such as the Challenger explosion or the fire-prone Ford Pintos, deal with catastrophic human injuries and deaths that might have been prevented.

That makes sense given the National Society of Professional Engineers’ First Fundamental Canon, which states that an engineer’s most basic ethical duty is “to hold paramount the safety, health, and welfare of the public.” But for software engineers, who build lines of code, not rockets and cars, it’s not so obvious what that duty means. Coders who studied computer science or an entirely unrelated field might never have seen an ethics course at all.

“You really have to think to get beyond the most obvious kinds of possible harm—failures in systems that are critical to public safety,” Raicu said. But that doesn’t mean they’re not there. Programmers’ work can also have much greater reach than that of other kinds of engineers, making consequences even more difficult to anticipate, Vallor added.

There’s no ethics checklist, and there probably never will be. Instead, Vallor and Raicu hope to make ethics part of the software development process, something engineers constantly return to as they see how their work is being used.

In other words, they don’t think that if Mark Zuckerberg and company had sat down for an intellectual discussion about privacy rights when Facebook was in its dorm-room infancy, all of our problems, past, present, and future, would have been solved.

“Users will always do things with technology that we didn’t anticipate,” Vallor said. “Ethics isn’t a due diligence where you check off the boxes and you’re done.”

Sometimes ethical issues prompt action. After the outcry in 2010, when Google revealed that its Street View vehicles had been “wardriving” (collecting and storing personal information from unsecured wireless networks), the company temporarily grounded the cars, appointed a director of privacy to monitor engineering, trained employees on responsible data collection and use, and added stronger privacy protections for its projects. But often there’s simply an apologetic acknowledgment coupled with a resigned sigh to the effect that technology moves faster than laws and social norms, and that eventually people will adapt.

That struck Raicu as odd. “At some point, it became acceptable that instead of technology meeting our needs, it creates needs and we’ll catch up,” she said. “Why not make technology that’s responsive to what we need and want?”

Any technology creates disruption. Just ask the horse-and-buggy industry. The difference, Raicu said, is that with software, most people don’t understand the changes under way. It’s hard for people to discuss privacy and the Internet when they don’t really know what’s being collected, or how. She hopes programmers will eventually get better at communicating with non-geeks and take more responsibility for ethical choices before problems reach the lawyers.

The Markkula Center’s course module was released earlier this month, and so far seven schools have signed on. Vallor is already working on a follow-up edition with more case studies.

For example, there’s the Girls Around Me app, which picked up personal data from nearby phones to identify and message single women in the user’s vicinity. It was pulled from Apple’s App Store, but only after an outcry from smartphone users who saw it as less a tool for dating than for stalking.

Or there’s Twitter’s new in-tweet “report abuse” button. How can programmers promote free speech without leaving users with no remedy for harassment?

What do Vallor and Raicu hope for? For starters, a more carefully considered approach to development—just because coders can do something doesn’t necessarily mean that they should—combined with more dialogue between the geeks and, well, everybody else.

“These kids are smart, but they’re not used to thinking about ethics in the context of their profession,” Raicu said. “We’re hoping that if we can get professors to talk about this, at least for a day, students will take it from there.”