Future Tense

Facebook Is Really Sorry About Messing With Your Emotions

We’re OK, right?

Photo by Justin Sullivan/Getty Images

Facebook announced Thursday that it will give its research standards and practices a facelift. What remains to be seen: whether the company’s proposed changes are meaningful or just cosmetic.

In June, as most Internet users can’t help knowing, Facebook released a paper explaining that it had manipulated the newsfeeds of nearly 700,000 members without their knowledge or consent. Researchers were investigating emotional contagion, or the idea that the happiness or sadness evinced by our friends’ posts might shade the affective tenor of our own posts. So they showed certain people more positive status updates and certain people gloomier ones, and took note of how that changed the way the unwitting test subjects described their lives on Facebook. Amid impassioned blowback, Facebook’s defenders argued that any experimental effects on people’s well-being were vanishingly slight, or that firms meddle with consumers’ feelings all the time. Yet the predominant reaction was white-hot rage.

After contrite blog posts from the researchers themselves (“Our goal was never to upset anyone,” wrote Adam D.I. Kramer. “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety”), Facebook Chief Technology Officer Mike Schroepfer has now published a statement. “We’re committed to doing research to make Facebook better,” he said, “but we want to do it in the most responsible way.” Schroepfer asserted that the emotional contagion study was valuable because it addressed the fear that users constantly exposed to their friends’ triumphs and celebrations might feel bad about their own lives. (The results also, of course, helped Facebook officers understand how to keep people happy and on Facebook.) But “we should have considered other non-experimental ways to do this research,” Schroepfer continued. The investigation “would have benefited from more extensive review by a wider and more senior group of people.”

So how will Facebook’s data-collecting framework change moving forward? In his announcement, Schroepfer claims that the company has given researchers clearer guidelines, which include adhering to a more rigorous review process if the area of inquiry is “deeply personal” (like emotions). It has created an internal panel of researchers and officers to further evaluate studies that fall within those guidelines. It has “incorporated education on our research practices” into the training that all new Facebook employees must undergo. And it has unveiled a website where it will publish some (but not all) of its academic findings.

The tone—humble, well-intentioned, admirable—almost obscures the fact that the actual revisions seem pretty meh. Will all the added committees and language solve any of the original problems? Schroepfer is silent on whether Facebook’s fresh-baked guidelines include obtaining consent from potential test subjects. And, as Princeton privacy expert Zeynep Tufekci told the Wall Street Journal, limiting the review board to just Facebook employees implies a less-than-perfect willingness to really weigh ethical concerns against the company’s interests.

On the other hand, as University of Maryland law professor James Grimmelmann points out, Facebook’s innovations here make it “an industry leader,” if only because “the industry’s standards are so low.” OKCupid, also under fire in recent months for its inquisitive shuffling of members’ romantic search results—very cute, guys—has not announced any research policy updates. (Grimmelmann’s support of the Facebook announcement seems particularly noteworthy because he has argued that Facebook and OKCupid may have violated the law with their research practices.)

ReadWrite’s Selena Larson is probably correct when she warns that the only surefire way to protect yourself from being a Zuckerberg lab rat is to flee the site entirely. What she might have said, but didn’t: If you are especially sensitive to big data behemoths fiddling with your feelings, you may not want to be online in the first place.