Are Facebook and other social media companies intentionally exploiting people’s psychological vulnerabilities to keep them addicted?
You bet, says Sean Parker, who made a fortune as an early Facebook investor and its first president. In an interview with Axios’ Mike Allen this week, Parker said that he has become something of a “conscientious objector” to social media. And he reflected with some regret on his own role in helping to mold the sort of company that Facebook would become.
“The thought process was all about, ‘How do we consume as much of your time and conscious attention as possible?’,” he said. “And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever, and that’s going to get you to contribute more content, and that’s going to get you more likes and comments. It’s a social validation feedback loop. … You’re exploiting a vulnerability in human psychology.”
Parker went on: “I think the inventors, creators—it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom at Instagram, it’s all of these people—understood this, consciously. And we did it anyway.”
There’s a weird element of humblebrag in Parker’s comments: He seems to be claiming more credit for Facebook’s creation than he probably deserves, given that his stint at the company lasted only a year. And we should probably take with a grain of salt any sweeping assessment of Facebook based on the personal experience of a guy who hasn’t worked there in 12 years. (The news feed hadn’t even been invented yet when he left following his arrest on suspicion of cocaine possession, for which he was never charged.)
Even so, Parker’s recollections are instructive because they undercut the company’s claims to be driven by the lofty mission of making the world open and connected—or as the latest version has it, building community and bringing the world closer together.
They also raise a deeper question: Is it wrong to build products with the goal of making them addictive? And if so, does that make social media companies that optimize for things like “time spent” and “engagement,” well, evil?
It’s not a new question, obviously. But it’s one that some in Silicon Valley seem to be taking more seriously in the wake of widespread disillusionment over social media’s role in politics and civil discourse. I wrote last week about how Zuckerberg has taken up the phrase “time well spent”—as opposed to just “time spent”—as a new way to articulate the platform’s ostensibly benevolent goals. (The technology critic who popularized that phrase, former Google ethicist Tristan Harris, found Zuckerberg’s use of it to be disingenuous.)
At the same time that Facebook was taking flak from Parker for its addictive qualities, another Silicon Valley startup was provoking similar criticisms. TechCrunch this week profiled a company founded by a neuropsychologist and a neuroeconomist with the goal of using machine learning to make other companies’ apps more, well, addictive. The startup’s name, aptly enough: Dopamine Labs. (It was also featured in an April episode of CBS’ 60 Minutes on the theme of “brain hacking.”)
Per TechCrunch’s story, Dopamine Labs has built a software program called Skinner—yes, it’s named after the behaviorist psychologist B.F. Skinner—that monitors an app’s various prompts and notifications and how users respond to them. The goal is to help companies tweak those features to make them maximally sticky. The company claims its service can add an average of 10 percent to the revenues of the startups that use it. “If all of that sounds creepy,” writes TechCrunch’s Jonathan Shieber, “don’t worry, it is.”
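The Skinnerian principle these tools lean on is the variable-ratio reinforcement schedule: rewards that arrive unpredictably produce far more persistent behavior than rewards on a fixed timetable, which is why slot machines and notification badges are so hard to ignore. The sketch below is a minimal illustration of that schedule in Python; it is not Dopamine Labs' actual software, and the probabilities and names are invented for demonstration.

```python
import random

def reward_this_action(mean_ratio=5):
    """Variable-ratio schedule: a reward arrives unpredictably,
    here with probability 1/mean_ratio per action. Skinner found
    this schedule produces the most persistent responding."""
    return random.random() < 1.0 / mean_ratio

random.seed(42)
# Simulate a user checking an app 10,000 times; on average about
# 1 in 5 checks delivers a "hit" (a like, comment, or badge),
# but the user can never predict which check will pay off.
hits = sum(reward_this_action() for _ in range(10_000))
print(hits)
```

The unpredictability is the point: if every fifth check paid off exactly, users would learn the pattern and check less often.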
To counteract the effects of its primary business on users—and perhaps on its own creators’ conscience—Dopamine also offers an app called Space that helps people manage their notifications and the time they spend online.
Predictably, the coverage has generated some backlash. A post about it on the tech bulletin board Hacker News drew the following top comment from user “013a”:
This organization is disgusting and is evidence enough that our industry has no sense of ethical responsibility. When massive regulation lands on Silicon Valley and we whine about the impact it has on innovation, remember companies like Dopamine Labs who truly deserved it.
François Chollet, a Google engineer who’s well-known in the A.I. community, reprimanded the company on Twitter:
Dopamine, an "AI platform" to engineer addictiveness. Saddest thing I've read about today. If you work in AI, you have de facto some power over others, so remember to always exercise this power responsibly. https://t.co/tcT7t1QETc — François Chollet (@fchollet) November 8, 2017
It’s true that Dopamine is unusually blunt about what it’s doing. But are its means or ends really much different from those of Facebook, Snapchat, Zynga, or any number of other companies that make their money by getting people hooked on their apps?
T. Dalton Combs, one of the company’s co-founders, doesn’t think so. “I think a lot of people in the technology industry are very sensitive to talking plainly about these issues, because everyone I think has a lot of anxiety and guilt about them,” he told me in a phone interview. “And so when someone stands up like Sean Parker did, or like Dopamine does, and talks plainly about these techniques and their benefits and their costs, everyone kind of freaks out. I mean, we know these kinds of technologies are why Google’s free and Facebook’s free, and why everyone in tech is able to make the salaries they do. But they scare us.”
Combs added that the company views its role as trying to democratize the user-retention techniques that big social media and gaming companies have already mastered, so that smaller apps and startups—including some focused on self-improvement and social causes—can compete on a more level playing field. Put another way: Unicorns shouldn’t be the only ones that can afford to be an attention vortex.
“Our goal isn’t more screen time for people,” he says. “What we’re trying to point out is that these technologies for changing people’s behavior can be used in a lot of ways. So far it’s mostly social media companies, companies with ad models, using them to increase on-platform time. But they can also be used to help you take your medication on time. Or get to the gym on a regular schedule. Or finish the online courses you’re taking.” He acknowledged that Dopamine doesn’t limit its clients to companies focused on social good, but he said it does turn away clients whose apps involve gambling or other “user-toxic” activities.
It’s easy to moralize about companies making their products addictive on purpose. And it really is refreshing when you hear of someone like Dong Nguyen, the Vietnamese game designer who pulled his viral hit Flappy Bird from app stores out of concern for its users’ time and well-being.
But we should be careful about conflating “addictive” apps or games with more destructive forms of addiction, such as substance abuse. Lots of activities can be addictive, colloquially speaking, without substantially harming a user’s relationships, career, or physical health. While they may be analogous in certain respects, there’s plenty of moral daylight between running a tobacco company and designing a harmless app in a way that keeps people checking in a few times each day. That’s why a lot of mental health experts don’t consider “smartphone addiction” to be a useful medical diagnosis.
It’s useful to know that Facebook was focused almost from its inception on hooking users and getting them to spend lots of time on it—and that there are lots of other companies today employing similar techniques, sometimes with the aid of neuropsychological research. We should all be more aware that companies are using our own psychological vulnerabilities against us because it helps us to be on our guard against overuse. But it’s unrealistic, in our current capitalist environment, to expect Internet companies not to be working hard to insinuate their products into our lives in one way or another. That, for better or worse, is simply the prevailing business model for online media. And with free services that sell us to advertisers as their real product, it is part of the bargain we have so far accepted.
So where does that leave us? Back where we started, essentially: Whether Facebook or any other app is good or bad for us doesn’t just depend on how much we use it, but on how we use it—and what that usage does to us, both individually and as a society. Being addictive, in other words, doesn’t make a company’s products evil: Being evil makes them evil.