Books

The Geek Freaks

Why Jaron Lanier rants against what the Web has become.

Jaron Lanier’s You Are Not a Gadget has one of the more sobering prefaces to be found in recent books. “It’s early in the twenty-first century, and that means that these words will mostly be read by nonpersons,” it begins. The words will be “minced into atomized search engine keywords,” then “copied millions of times by some algorithm somewhere designed to send an advertisement,” and then, in a final insult, “scanned, rehashed, and misrepresented by crowds of quick and sloppy readers.” Lanier’s conclusion: “Real human eyes will read these words in only a tiny minority of the cases.” My conclusion: Is that really such a bad thing?

Lanier is best known as a pioneer of virtual reality and an early star of Wired magazine. He was the guy with the dreadlocks and the giant V.R. goggles perched on his forehead, the epitome of the hippie-shaman-guru strain in tech culture. In what may have been a high point, Lanier’s V.R. glove was used to power the graphics in a Grateful Dead video. Lanier lost his company in the early ’90s in a then-legendary flameout, and he has been working in the seams of academia and Silicon Valley ever since. He’s the barefoot guy in the conference room, ever creative and childlike.

You Are Not a Gadget is basically a collection of his Internet columns and postings, bound, set into type, and called a “manifesto.” Over the years, Lanier has become a skeptic of that amorphous thing called Web 2.0. He directs most of his ire toward the “anonymous blog comments, vapid video pranks, and lightweight mashups” that flit through our browsers and Twitter feeds. But he’s also critical of bigger Internet landmarks, such as Wikipedia, the open-source software Linux, and the “hive mind” in general.

It would be fitting to rue Lanier’s fate as mere sausage for search algorithms if he had organized his opinions into a coherent thesis. The reality is that Lanier’s stimulating, half-cocked ideas are precisely the kind of thinking that gets refined and enlarged in vibrant Web places like Marginal Revolution, Boing Boing, and MetaFilter. Lanier maintains, for example, that musical development has essentially stalled. He has a challenge: “[P]lay me some music that is characteristic of the late 2000s as opposed to the late 1990s.” Lanier claims that listeners can’t distinguish between recent musical eras because music is “retro, retro, retro.” I would like to see that debate play out in the columns on Pitchfork.* Being scanned and rehashed in a blog post somewhere will be the best thing that ever happened to some of these words.

That is mostly because Lanier is an unreconstructed geek who throws around terms like realistic computationalism and numinous neoteny, which make your ears hurt. He will spend a few pages bemoaning the fact that a “locked-in” technology such as the computer “file” has cut off other, potentially more beautiful ways of organizing information on a computer.

As near as I can make out, Lanier’s view is that the Web began as a digital Eden. We built homepages by hand, played around in virtual worlds, wrote beautiful little programs for the fun of it, and generally made our humanity present online. The standards had not been set. The big money and the big companies had not yet arrived. Now Google has linked search to advertising. The Internet’s long tail helps only the Amazons of the world, not the little guys and gals making songs, videos, and books. Wikipedia, a mediocre product of group writing, has become the intellectual backbone of the Web. And, most depressingly, all of us have been lumped into a “hive mind” that every entrepreneur with a dollar and a dream is trying to parse for profit.

Yet, just when you’re about to sigh and go check your Twitter feed, Lanier writes something that gives you pause. On who really benefits from Facebook, for instance:

The real customer is the advertiser of the future, but this creature has yet to appear at the time this is being written. The whole artifice, the whole idea of fake friendship, is just bait laid by the lords of the clouds to lure hypothetical advertisers—we might call them messianic advertisers—who might someday show up.

A touch overblown, but it’s easy to forget that Facebook still needs to turn a profit on our friendships. Our favorite distraction awaits a messiah who will justify its billion-dollar valuation. “The only hope for social networking sites from a business point of view,” Lanier writes, “is for a magic formula to appear in which some method of violating privacy and dignity becomes acceptable.” Have you checked your privacy settings recently? Lanier has been proven prophetic.

Like others who have been on the Web from its early days, Lanier thinks the place has “lost its flavor.” Perhaps homepages in the mid-’90s did have a folk-art quality to them, though one heavily dominated by Simpsons and Star Trek references. Perhaps our regimented Facebook selves have made things more vanilla. Perhaps you did stumble down more idiosyncratic paths of knowledge before Wikipedia dominated the top Google search results. But these are the kinds of nostalgic observations that are ridiculous to anyone young. The Web hasn’t lost flavor; you’ve lost flavor. What Samuel Johnson said about his hometown holds true for the Internet: “No, Sir, when a man is tired of London, he is tired of life; for there is in London all that life can afford.”*

In addition to the general standardization and corporatization of the Web, Lanier sees the Web’s “open culture” as a failure. Instead of creating new songs or videos, we just steal from the previous decades of pop culture and create parodies and mashups. Instead of writing brilliant new computer programs, computer jocks toil at improving the free, open-source Linux, which offers no real innovation over the decades-old Unix. His best and most comprehensible critique of how the Web has smothered creativity involves what you could call the Ani DiFranco problem:

In the old days—when I myself was signed to a label—there were a few major artists who made it on their own, like Ani DiFranco. She became a millionaire by selling her own CDs while they still were a high-margin product people were used to buying, back before the era of file sharing. Has a new army of Ani DiFrancos started to appear?

In Lanier’s eyes, there is no longer a middle realm in which musicians can make music according to their own standards, sell it directly to fans, and not starve. Musicians are either kids in vans making just enough money for the next gig or dilettantes with a vanity career. The Facebook generation gets its music for free and doesn’t expect to pay for it, and this has helped bring about a musical Dark Age. That’s not a crazy idea, but it’s just Lanier’s hunch. When you start to poke around for data, the picture looks less bleak. According to one U.K. study, artists now make the majority of their money from live performances, and the total revenue accruing to artists has increased. Today’s theoretical middle-class musician would probably have to travel more, but he or she could still make a living.

There’s also the problem of the counterexample: What great artist has been left unrecognized by the Internet? Who hasn’t found a niche? Lanier, to his credit, is not a simple pessimist. He does propose a solution to the problem of compensating artists, artisans, and programmers in a digital era: a content database run by some kind of government organization: “We should effectively keep only one copy of each cultural expression—as with a book or song—and pay the author of that expression a small, affordable amount whenever it’s accessed.” Again, not a bad concept, but a Platonic idea that works only in theory. I don’t see the government opening an iTunes store anytime soon.

Lanier is a survivor and has good instincts: We need to be wary of joining in the wisdom of the crowds, of trusting that open collaboration always produces the best results, of embracing the growing orthodoxy that making cultural products free will benefit the people who actually produce them. But his critique is ultimately just a particular brand of snobbery. Lanier is a Romantic snob. He believes in individual genius and creativity, whether it’s Steve Jobs driving a company to create the iPhone or a girl in a basement composing a song on an oud.

The problem is that the Web is much bigger now, and both Jobs and the bedroom oud player must, in their own ways, strive for attention from the hive mind. And the results can arrive like lightning: Just a few weeks ago, a man in Uruguay was given a $30 million movie deal after posting a sci-fi short on YouTube. No one likes to become obsolete or cranky, but my sense is that Lanier doesn’t want to play on this new field. The talents and insights of Lanier and his peers were aimed at a tech-savvy elite, and that elite will never hold the same sway again. The innovative momentum now lies in democratizing the Web and its uses—Flickr, Twitter, and, yes, Facebook. Virtual reality was a lot of fun at the beginning, but the Web has moved on. It’s time to take off the goggles and gloves and join us here on Earth.

Corrections, Jan. 4, 2010: This article originally stated incorrectly that Samuel Johnson wrote the quoted phrase about London. In fact, he spoke it in conversation with his biographer, James Boswell.

The article also originally suggested that there were “forums” on Pitchfork. While Pitchfork reviews and articles are linked to and debated around the Internet, the site itself does not host user forums.
