Future Tense

A Robot Stole My Pulitzer!

How automated journalism and loss of reading privacy may hurt civil discourse.

Journalist with typewriter
Automated journalism like that produced by Narrative Science could save media jobs, but it could also hurt civil discourse.

William Gottlieb/Library of Congress.

Can technology be autonomous? Does it lead a life of its own and operate independently of human guidance? From the French theologian Jacques Ellul to the Unabomber, many once thought so. Today, however, most historians and sociologists of technology dismiss the idea as naive and inaccurate.

Yet the world of modern finance is increasingly dependent on automated trading, with sophisticated computer algorithms finding and exploiting pricing irregularities that are invisible to ordinary traders.

Meanwhile, Forbes—one of financial journalism’s most venerable institutions—now employs a company called Narrative Science to automatically generate online articles about what to expect from upcoming corporate earnings statements. Just feed it some statistics and, within seconds, the clever software produces highly readable stories. Or, as Forbes puts it, “Narrative Science, through its proprietary artificial intelligence platform, transforms data into stories and insights.”

Don’t miss the irony here: Automated platforms are now “writing” news reports about companies that make their money from automated trading. These reports are eventually fed back into the financial system, helping the algorithms to spot even more lucrative deals. Essentially, this is journalism done by robots and for robots. The only upside here is that humans get to keep all the cash.

Narrative Science is one of several companies developing automated journalism software. These startups work primarily in niche fields—sports, finance, real estate—in which news stories tend to follow the same pattern and revolve around statistics. Now they are entering the political reporting arena, too. A new service from Narrative Science generates articles about how the U.S. electoral race is reflected in social media, what issues and candidates are most and least discussed in a particular state or region, and similar topics. It can even incorporate quotes from the most popular and interesting tweets into the final article. Nothing covers Twitter better than the robots.

It’s easy to see why Narrative Science’s clients—the company says it has 30—find it useful. First of all, it’s much cheaper than paying full-time journalists, who tend to get sick and demand respect. As reported in the New York Times last September, one of Narrative Science’s clients in the construction industry pays less than $10 per 500-word article—and there is no one to fret about the terrible working conditions. And that article takes only a second to compose. Not even Christopher Hitchens could beat that deadline. Second, Narrative Science promises to be more comprehensive—and objective—than any human reporter. Few journalists have the time to find, process, and analyze millions of tweets, but Narrative Science can do so easily and, more importantly, instantaneously. It doesn’t just aim to report fancy statistics—it attempts to understand what those numbers mean and communicate this significance to the reader. Would Narrative Science have unmasked Watergate? Probably not. But then most news stories are easier to report and decipher.

Narrative Science’s founders claim that they simply want to help—not exterminate!—journalism, and they may very well be sincere. Reporters are likely to hate their guts, but some publishers, ever concerned with paying the bills, would surely embrace them with open arms. In the long run, however, the civic impact of such technologies—which are only in their infancy today—may be more problematic.

If there is one unambiguous trend in how the Internet is developing today, it’s the drive toward the personalization of our online experience. Everything we click, read, search, and watch online is increasingly the result of some delicate optimization effort, whereby our previous clicks, searches, “likes,” purchases, and interactions determine what appears in our browsers and apps.

Until recently, many Internet critics feared that such personalization of the Internet might usher in a world in which we see only articles that reflect our existing interests and never venture outside our comfort zones. Social media, with its never-ending flurry of links and mini-debates, has made some of these concerns obsolete. But the rise of “automated journalism” may eventually present a new and different challenge, one that the excellent discovery mechanisms of social media cannot solve yet: What if we click on the same link that, in theory, leads to the same article but end up reading very different texts?

How will it work? Imagine that my online history suggests that I hold an advanced degree and that I spend a lot of time on the websites of the Economist or the New York Review of Books; as a result, I get to see a more sophisticated, challenging, and informative version of the same story than my USA Today-reading neighbor. If one can infer that I’m also interested in international news and global justice, a computer-generated news article about Angelina Jolie might end by mentioning her new film about the war in Bosnia. My celebrity-obsessed neighbor, on the other hand, would see the same story end with some useless gossipy tidbit about Brad Pitt.

Producing and tweaking stories on the spot, customized to suit the interests and intellectual habits of just one particular reader, is exactly what automated journalism allows—and why it’s worth worrying about. Advertisers and publishers love such individuation, which could push users to spend more time on their sites. But the social implications are quite dubious. At the very least, there’s a danger that some people might get stuck in a vicious news circle, consuming nothing but information junk food and having little clue that there is a different, more intelligent world out there. And the communal nature of social media would reassure them that they aren’t really missing anything. Naturally, it could also be the next step in the evolution of much-hated content farms like Demand Media.

Consider what might happen if, as seems likely, big technology companies enter this business and displace small players like Narrative Science. Take Amazon. Its Kindle e-reader allows users to look up unknown words in the electronic dictionary and underline their favorite sentences; Amazon records and stores such information on its servers. This would come in handy if (when?) Amazon decides to build a personalized and fully automated news digest: After all, it knows what newspapers I read, what kinds of articles attract my attention, what sentences I tend to like, and what words I find puzzling. And I already own its device, where I can read such news digests—for free!

Or consider Google. Not only does it know my information habits better than anyone—even more so with its recently unified privacy policy—but it also operates Google News, a sophisticated news aggregator, which gives it superb analytical insight into current affairs. Thanks to its highly popular Google Translate service, it also knows how to piece sentences together.

Given all this, the idea that greater automation could save journalism seems short-sighted. However, innovators like Narrative Science are not to blame; used narrowly, their technologies may actually save costs and perhaps even allow some journalists—provided they can keep their jobs!—to pursue more interesting analytical projects rather than rewrite the same story every week.

The real threat comes from our refusal to investigate the social and political consequences of living in a world where reading anonymously becomes a near impossibility. It’s a world that advertisers—along with Google, Facebook, and Amazon—can’t wait to inhabit, but it’s also a world where critical, erudite, and unconventional thinking may become harder to nurture and preserve.

This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.