Intelligence Squared

Silicon Valley, Serendipity, and Cats

Slate/Intelligence Squared debater Evgeny Morozov explains the Internet.

Evgeny Morozov will argue that the Internet does not close political minds in the next Intelligence Squared debate. Photograph by Daniel Seiffert.

Evgeny Morozov is author of The Net Delusion: The Dark Side of Internet Freedom, which deals with the sometimes troubling intersections of politics and pixels. He’s also a visiting fellow at Stanford University’s Liberation Technology program and a Schwartz fellow at the New America Foundation.

Though Morozov has plenty of reservations about Google, Facebook, et al., at the Slate/Intelligence Squared live debate on April 17, he’ll join Jacob Weisberg in arguing that the Internet is not hemming us into ideological ghettoes, especially where politics is concerned. Morozov wants to redirect the conversation from illusory “filter bubbles” to the greed—and potentially fallible algorithms—coming out of Silicon Valley. I recently emailed with the Belarusian émigré about Jay-Z, cats, online serendipity, and how apps are changing Web surfing.

Excerpts of our conversation:

Slate: You’ll argue that the Internet doesn’t constrict our political field of vision.

Evgeny Morozov: The objective of our debate, as I understand it, is to assess to what extent the Internet is making it harder for citizens to find information that contradicts their existing viewpoints, primarily because Google now personalizes its search results and Facebook filters what news updates from our online friends we get to see. My response here is threefold. First, we don’t have conclusive evidence that any such “closing of the American mind” is happening. Second, even if it were happening, we don’t have good evidence that it’s Internet-driven. Third, I think that “filtering” is a normal response to a complex problem; libraries are filters, and so are newspapers. I find it hard to accept the premise that citizens ought to read every single Facebook message posted by their friends—or they risk being politically uninformed. Now, of course, there are ways to get it wrong—and Google and Facebook have a mixed record here—but I think we should think twice before we attack the very idea of “filtering.”

Slate: What does “getting it wrong” mean? What did Facebook and Google do? And where do they get it right?

Morozov: I primarily meant the lack of transparency and user control. Facebook has gone from offering no control over, or insight into, how it customizes our news flow to offering more and more of both. The same goes for Google: Turning off personalization used to be harder than it is now. There is now a clearly visible button on every search page that allows you to turn off all personalization.

Slate: There’s a sense with Google personalization that we no longer have to work as hard to access our ideal results. Has Web browsing become a more passive experience, and is that a bad thing?

Morozov: The Web may well have become a more passive experience—and it’s probably far less private than it used to be—but this is not a consequence of personalization or filtering. It’s OK to hate Google and Facebook, but we should do it for the right reasons.

Slate: What are the right reasons to hate Google and Facebook?

Morozov: I’ll pass on this question. I don’t see this debate as some kind of macro-evaluation of the Internet’s goodness. 

Slate: What is the new passivity a consequence of, then? 

Morozov: Well, the rise of apps is definitely a factor. The growing use of geo-location—as we use the Web from our mobile devices—is another. The collection of information about what we like and who we are is a third. All of these reduce the effort needed to do things both offline and online.

Slate: In The Filter Bubble, Eli Pariser writes, “Google is great at helping us find what we know we want, but not at finding what we don’t know we want.” Do you agree? 

Morozov: Well, I find that statement a bit utopian because it’s usually followed by the demand that Google start telling us things we don’t already know, so that whenever we search for Jay-Z, we are also prompted to do something about Joseph Kony. There may be occasions where a more interventionist attitude from Google is required—I made a provocative case in Slate a while ago that public health (and especially vaccination-related) decisions may be one such occasion—but to assume that this needs to happen on a universal scale, with Google taking on the role of a global enforcer of cosmopolitanism and “caring,” well, that I find very naive.

Slate: Except presumably we don’t actually want to do something about Joseph Kony, right? We just feel we should. But what about the argument that personalization makes it harder for us to serendipitously stumble across things we might like?

Morozov: As I pointed out in my review of Eli Pariser’s book, one shouldn’t confuse serendipity with randomness. Take this example from real life: I love watching history lectures online—mostly on YouTube. YouTube knows this. Now every time I come to YouTube, it shows me these history lectures instead of silly videos of cats or Hollywood-related videos that may be popular with other users. Has YouTube shown me cool history videos I wouldn’t have discovered on my own? Sure, it has. Of course, I may be an outlier. But the idea that serendipity happens to us only if we start with some kind of a “blank slate,” where the system knows nothing about us, that idea I find hard to believe in. As Pasteur said, “chance favors the prepared mind.” Customization can often help us with the preparation.

Slate: Does it disturb you that the main curators of human knowledge are now profit-driven companies? Are there any good alternatives?

Morozov: Well, the only alternative here is to rely on the state to provide the same services: email, search, social networking and so on. I hear they are trying to do something along those lines in that oasis of tolerance and understanding, Iran. Let’s wait and see how they pull it off. On a more serious note, I do accept the argument that there are contexts and activities—digital libraries come to mind—where the state would probably be a more reliable provider than the private sector, as the latter has a very different incentive structure. But this doesn’t have much to do with information infrastructure per se but rather with the thorny issues of copyright.

Slate: Personally, when it comes to information gatekeepers, do you prefer code or people?

Morozov: Let me turn the tables: When it comes to getting around, do you prefer walking or driving/taking public transport? Obviously, there’s space for both. Do we know of ill-conceived algorithms that might end up feeding us very narrow views? Sure. Do we know narrow-minded columnists or bloggers who do the same? Yes. Both are information gatekeepers. What I think we need to do is treat algorithms with the same critical stance we apply to human gatekeepers, for, ultimately, algorithms don’t normally write themselves—they are human creations.

Slate: Your debate opponent, Siva Vaidhyanathan, claims that one way the Internet narrows us in politics is through techno-narcissism, or the self-serving belief that our digital toys and obsessions (Twitter, Facebook, BlackBerrys) make a real difference in current events. He thinks techno-narcissism leads us to concentrate, say, on the role of social media in organizing protests, rather than the deep, underlying issues behind the protests.

Morozov: Great! Can we do the same with the subject of our debate and, instead of focusing on the personalization algorithms, look at some of the “deep, underlying issues” shaping the Internet? In my book, The Net Delusion, I actually discuss this problem at length (I call it “Internet-centrism,” though)—yes, there’s a tendency to put the Internet and associated technologies at the center of every explanation, whether it’s explaining the latest wave of democratic uprisings or the greatest threat to mankind and democracy. Why does Facebook employ filters? Well, because its “frictionless sharing” crusade is making us share more, and without filters users would use the site less. Why do they do it? The more they know about us, the more they can make in advertising revenues. So maybe what we are talking about is not all about the algorithms or even the “Internet” (a notion that I increasingly find unhelpful in explaining today’s world) but about the greed of Silicon Valley?

Many of the companies in Silicon Valley are run by venture capitalists who are as wild about capitalism as your average Viacom investor. The problem with the Silicon Valley crowd is they are so caught up in their own techno-fetishism that they often see a quasi-religious/spiritual dimension to their work. This I find unhealthy. 

Slate: What is your favorite thing about the Internet?

Morozov: Cats.

Slate: Least favorite?

Morozov: Cats.