Future Tense

Fighting Tech With Tech

Can apps cure our smartphone addiction? This neuroeconomist thinks so.

A neuroeconomist says he wants to help you be the person you want to be.
Jacob Ammentorp Lund/Thinkstock

On a recent episode of If Then, Slate’s tech podcast, hosts April Glaser and Will Oremus spoke with T. Dalton Combs, the co-founder of a controversial startup whose mission is to help other companies make their apps and platforms more addictive. They discuss the ethics of Dopamine Labs’ mission, how Combs and his company choose whom they work with, and whether tech can solve issues with addiction.

A transcript of their conversation, which has been edited for length and clarity, is below. You can also listen to the full episode and subscribe to If Then in iTunes.

Will Oremus: What is neuroeconomics?

T. Dalton Combs: Neuroeconomics—that’s the field I got my Ph.D. in—is about using models from economics to try to understand why people do what they do combined with neuroimaging and neuroscience data. We put people in MRIs, and we ask them questions, and we try to look at their brain activity and figure out why they’re doing what they’re doing. We’re taking these old models from economics and updating them with insights from neuroscience and actually looking at the brain to figure out how people make decisions.

Oremus: What’s one thing discovered from that very young field that has surprised people?

One of the big findings in the field is that losses hurt a lot more than gains feel good: Losing a dollar hurts about twice as much as gaining a dollar feels good.
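To make that asymmetry concrete, here is a minimal sketch (not something from the interview; it is the standard loss-aversion value function from prospect theory, using the roughly 2-to-1 weighting Combs cites):

```python
# A minimal sketch, assuming the roughly 2-to-1 weighting Combs mentions
# (prospect theory's loss-aversion coefficient). Not from the interview.
LOSS_AVERSION = 2.0

def subjective_value(dollars: float) -> float:
    """Felt value of a gain (positive) or loss (negative), in rough 'utility' units."""
    return dollars if dollars >= 0 else LOSS_AVERSION * dollars

# A fair coin flip for +/- $10 is worth $0 in expectation, but feels negative,
# which is why most people turn such bets down.
expected_feeling = 0.5 * subjective_value(10) + 0.5 * subjective_value(-10)
print(expected_feeling)  # -5.0
```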

Oremus: How do you apply a finding like that to the work that you do at Dopamine?

The research we draw on looks at what causes people to make choices. Most of our choices are driven by the habit system, which works like this: You take an action, and sometimes that action feels really good, and that causes you to do it more in the future.

Let’s say you wanted to go to the gym more often. I’m waiting for you at the gym, and when you get there, I give you a big high five and say, “Great job for going to the gym today.” That’s going to make you more likely to go to the gym again in the future. But if you get that every single time, it just becomes background noise because you’re used to it. It’s just a normal gym experience, and it stops changing your behavior.

What we do is take that kind of finding—the wiring in the brain that we know underpins it—and we help developers figure out when and how to communicate with their users in order to make different actions and different parts of their apps more habit-forming.

Oremus: What’s an example, then, of a tactic a company can use to take advantage of this?

When a company comes to us, the first thing we do is figure out which critical actions in its app really drive engagement. For a social app, it’s obviously posting. For a health care app, it might be taking your pills on time or going on regular walks.

Then we figure out a way to sometimes make it more rewarding than usual. This is usually just simple user-interface sugar stuff. When you text a friend a “Congratulations” iMessage, you get confetti falling from the top—those little elements are called UI sugar. It just makes it kick a little bit harder. Then, on a user-by-user basis, we change that sugar to make it more or less enticing to build that habit over time.
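As an illustration of the mechanism only (a hypothetical sketch; these function names and numbers are invented, not Dopamine Labs’ actual API), the core idea is an intermittent reward schedule whose frequency is tuned per user:

```python
import random

# Hypothetical sketch of a per-user variable-reward schedule of the kind Combs
# describes: reward the critical action only some of the time, and tune how
# often for each user so the "UI sugar" never becomes background noise.

def should_show_sugar(user_reward_rate: float) -> bool:
    """Decide whether this completion of the critical action gets confetti."""
    return random.random() < user_reward_rate

def adjust_reward_rate(current_rate: float, recent_engagement: float) -> float:
    """Toy per-user tuning: reward less often as the habit strengthens,
    more often if engagement is dropping. Bounds and step sizes are made up."""
    if recent_engagement > 0.8:   # habit looks established
        return max(0.1, current_rate - 0.05)
    if recent_engagement < 0.3:   # habit is fading
        return min(0.9, current_rate + 0.05)
    return current_rate

# Example: a user whose habit is well established gets confetti less often.
rate = adjust_reward_rate(current_rate=0.5, recent_engagement=0.9)
if should_show_sugar(rate):
    print("show confetti")
```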

Oremus: Last week, Sean Parker, who was a co-founder of Napster and the first president at Facebook, made some comments about those early days of Facebook and how they consciously set out to sort of addict people or engage people in just the kind of ways you were talking about. What did you make of Sean’s comments? He actually seemed to be expressing regret over this and saying that this is something they should not have done.

I think it’s easy for him to say that it’s something they shouldn’t have done when he has this much distance between him and making those decisions. Once the engagement’s in his back pocket, he can be a little more straightforward about it.

Basically, at any technology company, someone is thinking in those terms. Whether you’re a meditation app, a personal finance app, a health care app, an education app—you have to be thinking about how you interact with your users, and the No. 1 thing everyone wants is to see users using the product more. It’s very candid of him to talk about that openly, but basically any tech company that has a user interface thinks about its user interface that way.

April Glaser: I want to push back: He’s able to connect the dots by looking behind him and saying, “This addictive tech that we built has not necessarily been good for people. We do spend too much time buried in our phones, and it’s affected our relationships.” Why would we just dismiss that by saying, “It’s easy for him to say that now that he’s made money”?

I would say two things. One is that as long as tech companies have ad-supported business models whose economics are driven by eyeball hours, there is always going to be a tension in the tech industry between wanting to respect people’s autonomy and wanting to make sure people spend a lot of time on the platform. But I also think that you’re right: A lot of tech companies and a lot of entrepreneurs are seeing this problem, and they want—even for cynical reasons—a healthier relationship with their user base, because they don’t want their users to hate the role that their technology plays in their lives. I don’t think Twitter is excited that people talk about how much they hate being addicted to Twitter. They want people to feel like they have a healthy relationship with their product. And I think that is a direction that a lot of technology companies’ user interfaces are moving.

Oremus: Do you have a lot of clients already who are using your tools?

We have about a dozen small-ish clients, mostly focused on health care.

Oremus: Do you have any concerns about a client who’s making an app that might be destructive to society in some way, and you’re going to help them make that more addictive?

Yes, we are selective about our clients. We watch how they talk about their users. Many publishers have a very dismissive, patronizing tone when they talk about their users or their readers. We also look at the critical action that they want to turn into a habit and ask, “Is this a behavior that’s actually good for the end user?” We use those questions to evaluate who we work with, and if you get too many zeros on those, then we’re not going to be able to help you out.

Oremus: Has any company so far gotten too many zeros, so that you’ve actually turned away business?

I would say about a third of our inbound gets turned away that way.

Glaser: There’s merit, instead of making something addictive so that people will be drawn to use it, in just making it good, right?

I think that’s a great point. And I think you can see this really clearly in two products: Google and YouTube. They’re both owned by Alphabet, but with the Google product, they measure their success by how quickly they get you off the home page. So they want you to get onto Google, find exactly what you’re looking for, and get gone. And that is their metric of success, whereas YouTube is all about time on platform. I do support the idea of finding ways to design products where the less you use them the more helpful they are, but I think there are inevitably going to be products that are more useful the more you use them—like a productivity tool, where you want to be clearing more items from your checklist.

Oremus: Give me an example of a company that’s missing out on an opportunity or doing something wrong in terms of your knowledge of how you keep people engaged.

There are so many. Let’s talk about Salesforce. I imagine a lot of people have personal experience with touching a Salesforce panel. It would be really easy to make a Salesforce panel more engaging by adding these little hits of dopamine all over the place. Any meditation app is leaving a lot of Zen on the table by not thinking about how to make a mindful practice a habit of mind.

Glaser: I think that’s a ridiculous statement.

Oremus: Tell me, why do you feel like that’s ridiculous?

Glaser: Just leaving Zen on the table. I think that if the point is for people to live a more Zen life, maybe that would be putting the phone down, not engaging more with the app.

But I want to push back on something else you said earlier, about the point of Google being for people to spend less time on Google. I don’t really see that as being the case. Google is actually bringing more and more information into its search page: Wikipedia articles are being brought in as snippets, restaurant reviews are being brought in on the side, and you can play videos right on the search page. I actually see them trying to increase engagement on Google Search as opposed to trying to get people off.

And this is just kind of the way platforms work: You’re stuck in these walled gardens, and they keep making more of them, where it’s hard to find a door out. Feeling like you have to stick with a product that you are now invested in—perhaps emotionally—just doesn’t seem healthy. It doesn’t seem like the direction, at least, that I want my internet usage to take.

I guess this might be a good time to talk about Space. This is the other product that we make, designed to go directly to users who feel the way you do about their relationship with technology. They don’t like all of these tricks being used, without their knowledge, to get them to spend more time on Twitter than they wish they would spend. Space works the opposite way from the Dopamine API. It works by putting a break—a space—between you and the most addictive apps on your phone.

Glaser: There are all kinds of apps like that, which get you to stop using social media and stuff like that.

It’s not to get you to stop using it. Let me try to walk you through the user story. Let’s say I spend way too much time on Reddit. I use the Space plugin for Chrome. When I type in reddit.com on my laptop, before Reddit loads, it loads a three- to 12-second breathing exercise. And that breathing exercise does two things. One, it gives me a moment to break my flow and ask, “Is this really where I want to be right now?” And by putting a gap between me clicking on that bookmark and Reddit loading, it weakens the habit loop in my head. On top of that, the length of that breathing exercise is modulated: When Space thinks I’m being highly compulsive, it’s a much longer gap, and when it thinks I’m being deliberate, it’s a very short gap. It’s designed over time to encourage a more deliberate interaction between us and our technology.
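A rough sketch of that logic, with an invented compulsiveness heuristic (Combs doesn’t describe how Space actually scores this), might look like:

```python
# Hypothetical sketch of Space's adaptive pause, as Combs describes it: a
# breathing exercise of 3 to 12 seconds, longer when the visit looks
# compulsive, shorter when it looks deliberate. The scoring here is invented.

MIN_PAUSE_SECONDS = 3.0
MAX_PAUSE_SECONDS = 12.0

def pause_length(compulsiveness: float) -> float:
    """Map a 0-1 compulsiveness estimate to a pause between click and page load."""
    compulsiveness = min(max(compulsiveness, 0.0), 1.0)
    return MIN_PAUSE_SECONDS + compulsiveness * (MAX_PAUSE_SECONDS - MIN_PAUSE_SECONDS)

def compulsiveness_estimate(visits_last_hour: int, seconds_since_last_visit: float) -> float:
    """Invented heuristic: rapid, repeated visits look compulsive."""
    frequency_signal = min(visits_last_hour / 10.0, 1.0)
    recency_signal = 1.0 if seconds_since_last_visit < 300 else 0.0
    return 0.7 * frequency_signal + 0.3 * recency_signal

# Example: sixth reddit.com visit in an hour, two minutes after the last one.
score = compulsiveness_estimate(visits_last_hour=6, seconds_since_last_visit=120)
print(round(pause_length(score), 1))  # about 9.5 seconds of breathing exercise
```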

Glaser: It seems parental. Fighting fire with fire makes more fire, and fighting tech with tech might just make more tech.

Oremus: I might be the gullible one here. I’m like, Ooh, that sounds cool. I wanna try it.

There was a story about Dopamine Labs recently on TechCrunch, and people were debating it on Hacker News, the popular tech discussion board. I’ll read the top comment on there:

This organization is disgusting and is evidence enough that our industry has no sense of ethical responsibility. When massive regulation lands on Silicon Valley and we whine about the impact it has on innovation, remember companies like Dopamine Labs who truly deserved it.

Defend yourself against that, Dalton.

The techniques that we’re promoting are extremely popular in Silicon Valley. What we’re trying to do is bring to the forefront that people are doing this stuff, that Facebook and Twitter and every company out there want you spending more time on their platforms. We’re trying to educate people about how companies accomplish that. Then we’re trying to build tools so that those same mechanisms of habit formation can be used for good, because I don’t think that it’s morally bankrupt to have people brushing their teeth more regularly or taking regular breaks from their computer to go on walks.

This rigorous technology of the mind is really what we need now as a civilization, because the thing that’s killing people nowadays is too much Facebook and cheeseburgers. We solved the problems of the biological age with vaccines and antibiotics, by discovering all these things and stopping what was killing people. We’re not going to fix these behavioral problems of addiction, technology overuse, and overconsumption of everything by walking away from the technologies of the mind. We’re going to solve them by getting rigorous and having a complete science and technology, so that people can reprogram themselves into the people they want to be.

Glaser: Something I think a lot about is the difference between habit and ritual. Habit is something I do again and again without intending to, and ritual is something I do very intentionally. I, personally, want to have more autonomy, such that the things I do repeatedly are things I intend, not things I do without consciousness or without knowing it. But, that said, having things happen for you automatically sure is convenient, and that might be something a lot of apps want.

When you go out to lunch, do you get a soda or a water?

Glaser: I usually have a water bottle on me.

Do you think that is a deliberate moment-to-moment decision, or are you that way because carrying a water bottle with you has become a habit?

Glaser: I carry a water bottle with me because I forget a lot. When I remember, I’m happier.

I always get water, and it’s just a habit. It’s not something I think about. I don’t think, Do I want soda or water? The answer’s always water, because I’ve trained myself: The soda’s not good for you, Dalton. It’s going to taste good, but it’s not what you should get right now. And I think that is how a lot of people are, and why some people look more diligent than other people, why some people have different diets than other people. I think a lot of it comes down to the habits that we build into ourselves. Imagine a rigorous technology that allows anyone to decide, Man, I wish I was the kind of person who got out of bed at 6 in the morning and went to the gym. Imagine if there were just a button on your phone that you could click that would turn you into that person. That’s where a rigorous technology of the mind will take us.

Oremus: Dalton, please build me that button. I want it.

If Then is presented by Slate and Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.