Robot Reporters Need Some Journalistic Ethics

April 2 2014 7:27 AM

Bots on the Beat

How can we instill journalistic ethics in robot reporters?

Journalists earn their audience’s respect through diligence, ethical decision-making, and transparency.

Photo by Digital Vision/Thinkstock

On March 24 at 2:36 p.m., the New York Post reported that the body of former Nets basketball player Quinton Ross had been found in a shallow grave on Long Island. A few minutes later, the paper corrected the story to indicate the victim was a different man by the same name. But it was already too late. Other news sources had picked it up—including a robot journalist.

Created by Google engineer Thomas Steiner, Wikipedia Live Monitor is a news bot designed to detect breaking news events. It does this by monitoring the velocity of concurrent edits across 287 language versions of Wikipedia. The theory is that if lots of people are editing Wikipedia pages about the same event, in different languages, at the same time, then chances are something big and breaking is going on.
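The core of that heuristic is easy to sketch. The function below is an illustrative reconstruction, not Steiner's actual code: the threshold values mirror the activity reported in the Quinton Ross case (eight edits by five editors in three languages), but the real monitor's tuning and data structures are assumptions here.

```python
def is_breaking(edits, window_minutes=30,
                min_edits=8, min_editors=5, min_languages=3):
    """Flag a topic as breaking news when recent Wikipedia edits
    to it cross simple activity thresholds within a time window.

    edits: list of (minute, editor, language) tuples for one topic.
    Thresholds are illustrative, echoing the case described above;
    they are not the monitor's published configuration.
    """
    if not edits:
        return False
    latest = max(minute for minute, _, _ in edits)
    recent = [e for e in edits if latest - e[0] <= window_minutes]
    editors = {editor for _, editor, _ in recent}
    languages = {language for _, _, language in recent}
    return (len(recent) >= min_edits
            and len(editors) >= min_editors
            and len(languages) >= min_languages)

# Eight edits by five editors in three languages within the window:
sample = [(0, "A", "en"), (2, "B", "en"), (3, "C", "de"),
          (5, "D", "fr"), (6, "E", "en"), (7, "A", "de"),
          (8, "B", "fr"), (9, "C", "en")]
print(is_breaking(sample))  # True
```

Note what this sketch makes plain: the bot's trigger is purely statistical. Nothing in it checks whether the underlying edits are accurate, which is exactly how a corrected-then-uncorrected error can slip through.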

At 3:09 p.m. the bot recognized the apparent death of Quinton Ross (the basketball player) as a breaking news event—there had been eight edits by five editors in three languages. The bot sent a tweet. Twelve minutes later, the page’s information was corrected. But the bot remained silent. No correction. It had shared what it thought was breaking news, and that was that. Like any journalist, these bots can make mistakes.


To mark the growing interest in online news and information bots, the Knight-Mozilla OpenNews project deemed last week “botweek.” Narrative Science and Automated Insights run robot journalists that produce sports or finance stories straight from the data, some of which are indistinguishable from human-written stories. The Quake Bot from the Los Angeles Times made news recently, including here at Slate, by posting the first news report about a 4.7-magnitude earthquake that hit the area. Twitter bots like NailbiterBot or the New York Times 4th Down Bot produce newsy sports tweets to entertain and engage, while the TreasuryIO bot keeps a watchful eye on U.S. federal spending. Researcher Tim Hwang at the Data & Society Research Institute is looking at how to use bots to detect misinformation on social networks and target the people spreading it, or others around them, to try to correct it.

As these news bots and the algorithms and data that run them become a bigger part of our online media ecosystem, it’s worth asking: Can we trust them? Traditionally, journalists have earned their audience’s respect through diligence, ethical decision-making, and transparency. Yet computer algorithms may be entirely opaque in how they work, necessitating new methods for holding them accountable.

Let’s consider here one value that a robot journalist might embody as a way of building trust with its audience: transparency. What would a standard transparency policy look like for such a bot?

At its core, transparency is a user-experience problem. Some have argued, and rightly so, for the ability to file a Freedom of Information Act request for the source code of algorithms used by the government. Some tweet-spewing news bots are open-sourced already, like TreasuryIO. But source code doesn’t really buy us a good user experience for transparency. For one thing, it takes some technical expertise to know what you’re looking at. And as a programmer, I find it challenging to revisit and understand old code that I myself have written, let alone code someone else wrote. Furthermore, examining source code introduces another bugaboo: versions. Which version of the source code is actually running the Twitter bot?

No, at the end of the day, we don’t really want source code. We want to know the editorial biases, mistakes, and tuning criteria of these bots as they are used in practice—presented in an accessible way. Usable transparency demands a more abstract and higher-level description of what’s important about the bot: more “restaurant inspection score”–style than unreadable spaghetti code.
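What might such a "restaurant inspection score"-style disclosure look like in practice? Here is one hypothetical shape: a short, machine- and human-readable fact sheet a bot could publish alongside its feed. Every field name here is invented for illustration; no such standard exists, and none of the bots named in this piece publishes anything like it.

```python
# A hypothetical transparency disclosure for a news bot.
# All field names and values are illustrative assumptions,
# not a real standard or any real bot's published policy.
BOT_DISCLOSURE = {
    "name": "example-breaking-news-bot",   # hypothetical bot
    "data_sources": ["Wikipedia recent-changes feed"],
    "tuning": {                            # editorial thresholds
        "min_edits": 8,
        "min_editors": 5,
        "min_languages": 3,
    },
    "known_failure_modes": [
        "repeats errors present in its sources",
        "stays silent after a source corrects itself",
    ],
    "correction_policy": "none (disclosed as a known gap)",
    "source_code_version": "unspecified",  # the 'versions' problem
}

for field, value in BOT_DISCLOSURE.items():
    print(f"{field}: {value}")
```

Even a stub like this would answer the questions source code can't answer conveniently: what the bot reads, how it's tuned, and what it does, or doesn't do, when it gets something wrong.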


