Stephen, Laura, and Chris,
Chris made a crucial point earlier: The most popular television shows are rarely what reviewers or critics bother writing about. Most reviewers would rather critique the latest episode of Mad Men because, really, what else can possibly be said about The Big Bang Theory? Moreover, Big Bang hardly needs anybody to champion it. It makes tons of money. Nobody is going to change his mind about watching that show. No one is going to say, “Oh, there’s a little show called Big Bang Theory, maybe I’ll give that one a chance, I wonder if the first season is on Netflix.”
I also strongly doubt even the most fanatical Big Bang devotee would have any interest in a thousand-word critique of his favorite show each week, so even if a television critic were to write about it in good faith (“My Year of Watching The Big Bang Theory”—now there’s a Slate pitch!), who would ever read it? (I shouldn’t be so quick to generalize; there’s definitely a guy. There’s always a guy.)
But as Chris observes, there exists a broad, ever-widening gulf separating the types of television shows, movies, and video games that critics enjoy discussing from what everybody else is actually watching (or playing, in games’ case). It’s the same gulf that divides network television from “prestige” television, or art films from Michael Bay movies. But it’s like, right, we don’t expect A.O. Scott to give every Michael Bay movie a glowing review.
In video games, this cultural gap—this rift between what readers enjoy playing and what many writers enjoy writing about and, indeed, the type of writing most outlets are willing to pay writers for—first manifested around 2007. Or at least, that was when I finally noticed a lot of my co-workers at Ziff Davis Media becoming, independently but simultaneously, exhausted or flat-out bored by a lot of upcoming triple-A titles. Those games were fine and good, but our print-media coverage was growing predictable, rote. Many of us became much more interested in whatever was happening outside our own little ecosystem—resulting in a seeming uptick in the press’s attention to “indie games.” We didn’t conspire to do this. We didn’t collude. It just happened.
And then something else happened: Print media died. Our magazines were just too slow. We were constantly scooped left and right by online media—“the little blogs are winning,” I lamented when Electronic Gaming Monthly shuttered in 2009—and advertisers had finally decided their dollars went further online anyway. Those “enthusiast press” periodicals had once functioned as the gatekeepers, as the arbiters of taste. And then those magazines vanished one by one.
Which brings us to 2014.
The subsequent cultural free-for-all—a sudden incursion of online writers and bloggers, every byline seemingly as credible as the next one—has permanently leveled the landscape. Suddenly there is a whole lot more to write about, and also many more writers to do it. Could games like Gone Home or Kentucky Route Zero have succeeded during the days of print media? Could they have even existed before the death of disc-based media and the rise of digital distribution?
Almost certainly, the answer is no—just as television as artful as Mad Men and as weird as “Too Many Cooks” could not have existed before premium cable and Hulu Plus, could not have succeeded before audiences collectively ditched DVDs for tablets and laptop screens. New business models have led to new audiences, new critics, and—because all things are bizarrely recursive—new types of television shows and video games coming down the pipeline.
Gamergate rails against many of these sea changes. It bills itself as a “consumer advocacy movement,” but this claim really only works if you refuse to believe that reviews—the type of writing you might’ve seen, at one time, in TV Guide—are distinct from games criticism, the likes of which A.O. Scott might write as a movie critic. One of Gamergate’s primary sticking points is that games critics don’t write for “gamers” anymore—and that’s often true! That’s exactly why Leigh Alexander’s late-August column “ ‘Gamers’ don’t have to be your audience. ‘Gamers’ are over” caused offense to so many people. Colloquially referred to as the “Death of Gamers” piece (although, confusingly, there were multiple), Alexander’s article actually only suggested that there are, by now, many different audiences for many different types of work. There is no single audience anymore, Alexander opined, and perhaps there never really was. The “gamer” identity itself is illusory, her piece suggested—a byproduct of so much demographic marketing.
Meanwhile, the audience most alienated by Alexander’s op-ed was, a little humorously, composed of the same sort of folks who appear in the comments section of any article about “representation” or “diversity” (and I’ve put those words in scare quotes because they are, in fact, total buzzkills) only to chant “Create your own industry!” at critics. Alexander’s point—maybe her most inflammatory one—is that many in the industry already have. It is very painful for people when they intuit, whether as readers or as players, that something was possibly not intended for them. No one likes to feel that he or she is unwelcome.
So what is Gamergate? I’d originally considered, in making my Top 10 list of 2014 games, placing it somewhere near the top, because by now Gamergate has become one of the most expansive (and by many accounts, expensive) works of intertextuality to date. Gamergate borrows its narrative techniques from both ARGs and grassroots online activism; participants collaborate on intricate networked narratives, connecting both influential and incredibly tertiary industry figures to corporate interests big and tiny. The authors of Gamergate’s oral history are a decentralized collective, a conglomerate of modern-day bards with no single leader, who use real-world details from the lives of real-world people to give the mythos texture, verisimilitude and, probably most important, real-world stakes. This is literally what we mean by “gamification”! (It’s also worth noting that Gamergate, alas, cannot be paused.)
At the beginning, I was too close to it to really even understand it, I think. In my first-ever Guardian editorial, I wrote that I was baffled to have become a target of hacking attempts, and for reasons not immediately discernible to me. In the aftermath of that same piece’s publication, I couldn’t quite grok how my integrity could be in question. In both cases I struggled to comprehend how knowing just one person, and not very well, among the thousands of people I do know, could’ve ever become the source of any controversy at all.
One of the most interesting, if galling, things about Gamergate is that, if I do ever discuss it, I must also disclose my “involvement”—because ethics—and this need to recap my own experience only perpetuates Gamergate itself, only galvanizes its narrative, giving the whole thing further memetic power. Conversely, choosing not to discuss Gamergate is also not an option—again, because ethics—since the “-gate” suffix was itself first appended, really, as an indictment of the industrywide reluctance to discuss any of it at the start. So it’s a frustrating catch-22, wherein Gamergate is tautologically important simply because ignoring it is an ethical offense. (But discussing it might also be some sort of ethical violation, and the circular logic of the whole thing is constructed explicitly to breed out any dissent.)
For better or worse, Gamergate is a culturally significant moment. The games industry has mythologized itself for decades now, through printed monthly circulars and, more recently, via the Web. Small wonder, then, that a collective would indict that perceived insularity, would deconstruct the entire industry—painfully, often indiscriminately—one individual actor at a time.
Whether Gamergate actually accomplished any good is a separate matter, far beyond the reach of this dispatch. But it’s left its mark nonetheless, and academics from myriad disciplines—ethnographers, Internet historians, anyone who works in the so-called “digital humanities”—will exhaust themselves trying (helplessly) to study and quantify it for years to come.