In December 2001, Aaron Swartz took to his blog to describe an unusual dream:
I walked in, not quite knowing what to expect. It was a modernly-designed loft with sleeping quarters, a meeting room, ping-pong tables and computers spread around. Windows were everywhere and light streamed in, making the place feel airy and bright. Many of my friends from the Internet were there, as well as a number of people I didn't know, but who seemed very friendly. We were all about the same age.
We were working together on a project that we thought would change the world. We were committed to it, and worked well as a team: we helped each other out with what needed to be done, and kept each other's enthusiasm up. We worked hard, but we also took time off to play ping-pong or go water-sliding. There was a bulletin board to coordinate events as we tended to keep irregular hours.
The team was rather large, but I got to know everyone on it well and we became good friends. We all worked to support each other, and everything was run democratically. New folks who wanted to help were invited in, but the group had to agree on them before they could join.
We learned all the time, both finding out the skills we needed to know ourselves, and teaching each other how to improve. While we knew there was a lot to be done, we focused on our one project with a single-minded determination, finishing it and solving all the problems that it raised.
Whether he knew it or not, that dream world he was describing really existed, minus the water slides, and Swartz would become one of its most important heirs.
There is a photograph of a teenaged Aaron Swartz sitting on a bench, wearing a “GNU’s Not Unix” T-shirt, and chatting with law professor Lawrence Lessig. This is a family picture of sorts, a snapshot of three generations of data idealists.
Swartz’s T-shirt came from a group called the Free Software Foundation, and the slogan refers to the operating system the foundation created. GNU (the name is a recursive acronym for “GNU’s Not Unix”) is a free, community-built alternative to the Unix operating system, conceived in the 1980s by a software developer named Richard Stallman; paired with the Linux kernel, it is commonly known as GNU/Linux.
In the 1970s, Stallman was one of many programmers affiliated with MIT’s Artificial Intelligence Lab. As Steven Levy tells it in Hackers, his history of the early days of modern digital computing, the AI Lab was a sort of programming utopia that drew computer enthusiasts from all around—a Brook Farm for the digital age. It was a flat, non-hierarchical system, where you were judged on the caliber of your work, not age or status or title or educational background. “There were people that were hangers on, that were not real students or staff members. They just were there, and they were helping,” says Brewster Kahle, an AI Lab affiliate in the early 1980s. “And that openness was very creative and wonderful.”
Stallman and his fellow hackers wrote and maintained the software that was central to the lab’s research. But in many ways, this was less a workplace than a political awakening. “Hackers spoke openly about changing the world through software, and Stallman learned the instinctual hacker disdain for any obstacle that prevented a hacker from fulfilling this noble cause,” writes Sam Williams in Free as in Freedom, his 2002 biography of Stallman. “Chief among these obstacles were poor software, academic bureaucracy, and selfish behavior.”
Steven Levy calls Stallman “the last of the true hackers”—of everyone affiliated with the AI Lab, he was the one who most avidly ordered his life around the group’s ethic. While other affiliates moved on to private industry and occasionally to great fortune, Stallman never lost his radical idealism. (In this way, he is sort of the Ian MacKaye of computing.) Stallman, who still keeps an office in what MIT now calls its Computer Science and Artificial Intelligence Laboratory, founded the Free Software Foundation in the mid-’80s and remains its president today. The organization is dedicated to the proposition that free software (“free as in freedom, not free as in beer”) is a moral imperative.
Swartz didn’t know Stallman personally, but he was inspired by the programmer’s morals, and by the fact that he’d fostered an organization that took ethics seriously while also getting things done. In 2002, Swartz saw Stallman speak at the O’Reilly Open Source Convention. “The most interesting thing I learned … was how human Stallman is,” Swartz wrote afterward. “As people asked him long questions he would practice his dance steps. He’d make jokes about everything. I could really see being him.”
At that same conference in 2002, the keynote address was delivered by Lawrence Lessig—the Richard Stallman of the copyright movement. Lessig, then at Stanford and now at Harvard, has written prolifically about inequities in U.S. copyright law. Corporate interests pressure Congress to keep extending the scope and duration of copyright protections, Lessig believes, keeping material out of the public domain and stifling the kind of innovation that arises from sharing and remixing. In the introduction to his 2004 book Free Culture, Lessig notes that “all of the theoretical insights I develop here are insights Stallman described decades ago.”
In 2001, Lessig came up with his own variation on Stallman’s Free Software Foundation, an organization he’d call Creative Commons. Lessig wanted to reform copyright laws by giving content creators more options for licensing their work—allowing them to specify, for instance, that anyone could use a photo non-commercially, or that people could adapt it without having to ask permission.
Lessig hired Lisa Rein, a writer and archivist, to help create the Creative Commons licensing metadata. She in turn suggested that Swartz, whom she knew from the Semantic Web community, should be the one to supervise the site’s metadata implementation. “I won’t lie and say it wasn’t hard at the time to convince these people that I needed this 14-year-old on the project,” she remembers. “I took a pretty big hit for it politically at the time. Until they met him. It didn’t make sense to anybody until they met him.”
Ben Adida, a contractor on the project, remembers Swartz as an active, important participant, “an extremely talented software engineer who just happens to be 14 and wearing a T-shirt that’s three sizes too big for him.” The two worked closely on the metadata implementation, and not always harmoniously. “He had incredibly high standards and we were not meeting them,” remembers Adida. “He was very critical of my work and he said so publicly. The guy was pretty hard on me.”
Swartz expected excellence from those around him, but he also cared deeply about connecting with his new colleagues. In April 2002, he flew to San Francisco for a Creative Commons event, and Rein chaperoned him around the city. She saw it as her role to ensure he met the right people. “I sort of decided he was going to be [either] a good superhero or a super-villain,” she says. Rein introduced him to everyone she knew in the open-access world, people who would become his friends and collaborators. For Swartz, it was an eye-opening, you-are-not-alone experience.
“I know that Aaron spent a lot of his early life really struggling with the fact that he was really interested in and curious about a lot of things that were not interesting to other people,” says Seth Schoen, a technologist with the Electronic Frontier Foundation. Schoen met Swartz in 2002, when he was 23 and his soon-to-be friend was 15. “I think a lot of the people he regarded as his peers and who regarded him as a peer were at least 10 years older than he was and often much more. And he felt that that was where the action was, where his interests were.”
Swartz had found his people, and he’d met them in the flesh, not just as names on the To: line of a mailing list. From 2002 to 2004, he spent a lot of time in their company, attending and speaking at numerous conferences about emerging technologies and tech issues. (When he was at home, he spent a lot of time fighting with his brothers and parents about who got to use the family Segway.)
Despite his youth, Swartz wasn’t treated like some adorable tech-world mascot. “I mean, socially, it was a lot of nerdy people there,” remembers Wes Felter. “It wasn’t a really sophisticated scene, honestly. … He was obviously less mature than other people there, but not by a wide margin.” Days were spent listening to speeches; after-dinner activities might include “a trip to the Apple Store to check out the then-new iMac, the one that looked like a tablet attached to a globe,” as Joey “Accordion Guy” deVilla put it in a recent online remembrance.
By 2004, Swartz had helped launch Creative Commons, worked on the RSS 1.0 standard, created and maintained a popular blog, and had a hand in countless other large and small projects. But he was also about to turn 18, and despite his suspicions about organized schooling, he was expected—by his parents and most everyone else—to go to college. In the summer of 2004, he enrolled at Stanford University.
* * *
For Swartz, Stanford felt less like the dream world of Creative Commons and more like a return to everything he hated about high school. By his first week in Palo Alto, he’d blogged that “it doesn't strike me that most Stanford students (and professors) are exceptionally bright.”
College was not an intellectual dream world—it was just another place that needed fixing. “If I wanted to start a more effective university, it would be pretty simple,” he wrote on his third day at Stanford. “Hire the smartest people and accept the smartest students, get them to work on projects that interest them ... organize a bunch of show-and-tells and mixers, and for the most part let them figure stuff out on their own.”
Swartz lived in Roble Hall, in a suite with three other people. “He was a pretty introverted guy, I was a pretty introverted guy,” remembers his roommate, Rondy Lazaro. “A few people in his dorm knew him for his work in computer programming. To them he was the Aaron Swartz.” To most people, though, he was just the guy at the end of the hall with the recumbent bicycle and filing cabinet.
Swartz studied sociology. “The other night, when [redacted] asked me why I switched from computer science to sociology, I said it was because Computer Science was hard and I wasn’t really good at it, which really isn’t true at all,” he wrote on his blog. “The real reason is because I want to save the world.”
In several posts, he chronicled his Stanford experience like an anthropologist taking field notes. He wrote about campus groupthink and critiqued his classes. Occasionally, his loneliness peeks through:
Stanford: Day 58
Kat and Vicky want to know why I eat breakfast alone reading a book, instead of talking to them. I explain to them that however nice and interesting they are, the book is written by an intelligent expert and filled with novel facts. They explain to me that not sitting with someone you know is a major social faux pas and not having a need to talk to people is just downright abnormal.
I patiently suggest that perhaps it is they who are abnormal. After all, I can talk to people if I like but they are unable to be alone. They patiently suggest that I am being offensive and best watch myself if I don’t want to alienate the few remaining people who still talk to me.
Swartz may not be the most reliable narrator of his college experience. Multiple people who knew him at Stanford say he wasn’t a complete loner, and that he was always ready to have a conversation. But the blog—which he chose not to share with his fellow students—is not the work of someone reveling in undergrad life. His classes were insufficiently engaging and challenging, and he pined for a girl he called TGIQ (for “The Girl in Question”), who, he wrote, always managed to “disappear before I can catch up with her.” Instead of chatting up his fellow freshmen, he haunted the office hours of professors like Lessig. “It’s fun, although it bothers me to bother them,” he wrote.
Worst of all was that his fellow students didn’t think the way he thought. Before he’d met the Creative Commons crew, he’d struggled to find peers he could relate to. “I remember he was really hoping that that’d change when he went to Stanford,” says Seth Schoen. “And I visited him there and he basically said that it hadn’t—that even the other Stanford undergrads around him weren’t curious about the things he was curious about.”
It’s probably more that they weren’t curious in the same way he was curious—that they accepted Stanford at face value and got the most out of it they could rather than questioning everything it stood for. Still, he tried to make Stanford work. In 2005, he helped launch the Roosevelt Institute Campus Initiative, a group pushing to get college students involved in politics. But he became increasingly disengaged from school, spending more time off campus with people who worked in political and data activism. He also supplemented his studies with a lot of outside reading. “He read more fiction as an 18-year-old computer genius than I read [now] as a creative writing grad student,” remembers Kat Lewin, one of Swartz’s dorm-mates.
The summer before he entered Stanford, Swartz read two books that changed his worldview. Moral Mazes, which he would later call his all-time favorite book, is an ethnographic study of American corporate managerial culture. In it, Robert Jackall examines the institutional logic of the corporate world, and explains how diffused responsibility and organizational insularity create a culture that rewards managers for doing the wrong thing. In Understanding Power, which treads similar ground, linguist and political activist Noam Chomsky expounds on how power structures lead good people to do horrible things.
Both books are indictments of bureaucracies—of how giant organizations harm outsiders who come into contact with them and those insiders who refuse to play the game. For someone like Swartz, predisposed to resist feeling like a cog in a machine, this was the intellectual justification he needed to kiss off the industrial education complex. Like North Shore Country Day School before it, Stanford never stood a chance.
Swartz took the first out he could find. It came courtesy of essayist and entrepreneur Paul Graham, who’d founded a company called Y Combinator in 2005. Graham, who made millions selling his company Viaweb to Yahoo, believed that smart and restless talents like Swartz ought to quit school and start building things. He invited budding entrepreneurs to send him proposals for tech startups; he’d pick the ones he liked and bring the founders to Cambridge, Mass., for a summer’s worth of bootstrapping.
Swartz pitched Graham something called Infogami, a platform that would help people build structured, data-driven, content-rich websites. It was a logical conceptual progression from the Semantic Web projects Swartz had been steeped in, and it became one of eight businesses to be funded that year. Not long after Mark Zuckerberg left Harvard to marinate in Palo Alto’s startup culture, Swartz made the opposite move, ditching California after just one year to build a startup on the East Coast. His college career was over.