Google's image recognition software returns some surprisingly racist results.

Google Scrambles After Software IDs Photo of Two Black People as “Gorillas”


Future Tense
The Citizen's Guide to the Future
June 30 2015 12:13 PM


Google Photos director Anil Sabharwal announces Google Photos during the 2015 Google I/O conference on May 28, 2015 in San Francisco, California.

Photo by Justin Sullivan/Getty Images

Image recognition software is still in its infancy. Sometimes that means it’s a little silly, as when Wolfram Alpha’s algorithms confuse cats with sharks or goats with dogs. Sometimes it’s a little creepy, as it was when Facebook announced that it can identify you even if your face isn’t showing. And sometimes it’s just really, really icky.

When Brooklyn-based computer programmer Jacky Alciné looked over a set of images that he had uploaded to Google Photos on Sunday, he found that the service had attempted to classify them according to their contents. Google offers this capability as a selling point of its service, boasting that it lets you “search by what you remember about a photo, no description needed.” In Alciné’s case, many of those labels were basically accurate: A photograph of an airplane wing had been filed under “Airplanes,” one of two tall buildings under “Skyscrapers,” and so on.


Then there was a picture of Alciné and a friend. They’re both black. And Google had labeled the photo “Gorillas.” On investigation, Alciné found that many more photographs of the pair—and nothing else—had been placed under this literally dehumanizing rubric.

“Google,” Alciné tweeted, “y’all fucked up.” To its credit, Google responded quickly. Yonatan Zunger, the company’s chief architect of social, replied to Alciné’s tweet, writing, “This is 100% Not OK.” In subsequent tweets, Zunger explained that he had reached out to the Photos team and that it was working on a fix that evening.

According to Alciné’s Twitter feed, the problem persisted even after that fix had been implemented. Ultimately, Google applied a secondary solution, reworking the system so that it wouldn’t apply the “Gorilla” tag to photos at all. Zunger writes that the company is also working on a number of “longer-term fixes,” including identifying “words to be careful about in photos of people” and “better recognition of dark skinned faces.”

While Google’s efforts to solve this problem are admirable, it’s still troubling that it happened at all. As Alciné wrote on Twitter, “I understand HOW this happens; the problem is moreso on the WHY.” 

Future Tense is a partnership of Slate, New America, and Arizona State University.