Future Tense

Advice to My E-Coach

My smartphone wants to tell me what to do. I have some suggestions of my own.


A version of this piece originally appeared in Issues in Science and Technology. It’s based on the report “Sincere Support: The Rise of the E-Coach.”

Until recently our friends were the ones who knew us best, and perhaps even knew what was best for us. But nowadays that role is being claimed by our smartphones. Thanks to sensors and apps, they know which shoes, books, and music we might like. And a new generation of apps is also helping us to lead physically and mentally healthier lives.

Over the past two years I’ve studied the rise of apps aimed at monitoring and improving our personal lives. They signal the emergence of the digital lifestyle coach, or e-coach. The e-coach differs in many ways from its analog predecessor: Its approach is data-driven, its monitoring regime is continuous, and its feedback is real-time and (hopefully) right on time. The promise is that with its objective data, clever algorithmic analysis, and personalized feedback, the e-coach will help me change my behavior for the better.

But before it can achieve that promise, there are a few things that I feel the e-coach seriously needs to improve. So for now, let’s switch roles, and let me give the emerging digital coach a few words of advice.

Be Honest About Your Imperfections

Getting accurate measurements of human behavior is tricky. For example, popular activity trackers tend to underestimate the activity levels (such as calories burned or distance walked) of people who walk slowly, like pregnant women or obese people. Activities with more subtle movements, like yoga, are also tough to measure. One user we interviewed during our research said that his heart-rate monitoring wristwatch would work only when it was strapped so tightly onto his wrist that it was uncomfortable and left a mark on his skin—and even then the data it provided were incomplete and inaccurate. So the tracker eventually ended up in a drawer—the fate, according to a survey by Endeavour Partners, of more than 50 percent of activity trackers.

Regardless of improvements in technology, the apps and gadgets designed to help me improve myself will likely continue to make plenty of errors for the foreseeable future. How can I best take advantage of what my imperfect smartphone has to offer?

Research in robotics might provide an answer. Studies have shown that people are more inclined to trust technology if it communicates clearly and honestly about its limitations. In flight-simulator experiments, for example, pilots will trust and collaborate with a system more effectively if it informs them when it is unsure about its own judgment.

Apps are also going to have to do a better job of explaining why they give the advice they do. If your stress monitor thinks you are stressed out but you don’t feel stressed at all, you can’t ask how it reached that conclusion. Maybe it misinterpreted the data? Or maybe you are not accurately sensing your own stress?

At Delft University of Technology, researchers are working on software that can give users the reasons for its actions. Applied to digital coaches, such a design could show users how a piece of advice was constructed. For instance, the app could display the measurements that led it to conclude that the user was stressed, along with its reasoning for recommending a walk to ease that stress. For me, honesty is the way to go, even if that means a “smart” app must admit that it doesn’t know everything.
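
To make that idea concrete, here is a minimal sketch in Python of what advice-with-reasons might look like. Everything in it, from the names to the thresholds to the stress rule, is invented for illustration; it is not the Delft researchers’ actual software.

```python
from dataclasses import dataclass


@dataclass
class Advice:
    """A recommendation bundled with the evidence and rule behind it."""
    recommendation: str
    evidence: dict       # the raw measurements the conclusion rests on
    rule: str            # the (invented) rule that fired
    confidence: float    # 0.0 to 1.0: how sure the coach is

    def explain(self) -> str:
        lines = [
            f"Recommendation: {self.recommendation}",
            f"Reasoning: {self.rule}",
            "Measurements behind this:",
        ]
        lines += [f"  - {name}: {value}" for name, value in self.evidence.items()]
        if self.confidence < 0.7:
            # An honest coach flags its own uncertainty.
            lines.append("Caveat: I am not confident in this reading; "
                         "my sensors may be wrong, or you may know better.")
        return "\n".join(lines)


def assess_stress(heart_rate: float, skin_conductance: float) -> Advice:
    # Thresholds are made up for this sketch; real stress detection
    # is far more involved and far less certain.
    stressed = heart_rate > 100 and skin_conductance > 8.0
    return Advice(
        recommendation="Take a short walk" if stressed else "Carry on",
        evidence={"heart rate (bpm)": heart_rate,
                  "skin conductance (microsiemens)": skin_conductance},
        rule="elevated heart rate AND elevated skin conductance suggest stress",
        confidence=0.6,  # deliberately modest
    )


print(assess_stress(heart_rate=112, skin_conductance=9.3).explain())
```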

Stop Talking Behind My Back

The digital coach is data-driven. In the process of monitoring and giving feedback, a continuous stream of data is collected about our behavior. This information has a literally intimate quality, as was shown when Fitbit users’ sexual activity showed up in Google searches. When you trust an app to collect and analyze intimate data, you want to be sure that those data are handled with appropriate care and confidentiality. But this is not the case. Research by the Federal Trade Commission showed that health apps and wearables spread not just anonymized activity patterns but also usernames, email addresses, unique device identifiers, and other highly personal data.

The ways in which data from digital coaching apps are being traded, sold, and reused remain largely opaque. A diabetic colleague of mine tried to find out what happens with the information from her wearable insulin pump when she uploads it into the cloud. By studying the fine print of the privacy policy and making several calls to the service provider, she learned that data she thought were used only for telecare purposes were actually also used (after being anonymized) for research and profiling. But she was unable to find out exactly how the data were being analyzed and who was doing the analysis. This worries her because the cloud service, which she used to pay for but is now free, encourages users to also upload their Fitbit data, suggesting to her that the costs of the service are now being covered by monetizing her data.

These sorts of concerns are not a solid basis for a trusting relationship to emerge between human and app. Giving users clear, transparent choices about how their data will and will not be used can pave the way for a more healthy and trusting relationship.

I recently talked to someone at an insurance startup that uses driving-behavior data to set personalized premiums. It gives users clear information about what data are being collected and how the information will and will not be used, along with controls to manage their data and even delete the information after the premiums are calculated. Its customers are very supportive of this kind of transparent data use, and both sides benefit from the openness. I think the same approach would work for a digital coach.

Just Let Me Be Me

My final word of advice to digital coaches would be to respect that people are different—there isn’t a one-size-fits-all healthy lifestyle. Health apps promote a certain image of health and well-being. Usually that image is based on some set of guidelines about how much exercise you should get, and how much fruit and how many vegetables you should eat. But for some, a good life might not entail strict compliance with an app’s exercise or dietary standards; it would instead leave more room for dining with friends, baking cookies with your kids, or enjoying a little sloth.

Time magazine reports on an app aimed at kids that helps them manage their eating habits with a simple traffic-light feedback system. High-calorie foods such as ice cream get a red-light classification; foods that should be consumed in moderation, such as pasta and whole wheat bread, are yellow; and things you can eat as much of as you want, such as broccoli, are green. The article describes a young girl who, since she started using the app, sees the world in red, yellow, and green. “Everything,” she says, “corresponds to how many red lights you eat.” While the app provides a useful tool for managing a diet, it also instills a very specific perspective on food: a perspective informed by calories rather than other qualities of food, like its social aspects, and a perspective that makes eating something you succeed or fail at.
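
The underlying logic is simple enough to sketch. Below is a hypothetical Python version of such a traffic-light classifier; the calorie figures and cutoffs are invented for illustration and chosen only so that the foods named above come out red, yellow, and green as described.

```python
# Illustrative calorie values per serving; not real nutrition data.
CALORIES_PER_SERVING = {
    "ice cream": 250,
    "pasta": 180,
    "whole wheat bread": 160,
    "broccoli": 30,
}


def traffic_light(food: str) -> str:
    """Classify a food as red, yellow, or green by calorie content."""
    calories = CALORIES_PER_SERVING.get(food)
    if calories is None:
        return "unknown"   # an honest coach admits it doesn't know
    if calories >= 200:
        return "red"       # eat only occasionally
    if calories >= 100:
        return "yellow"    # eat in moderation
    return "green"         # eat as much as you want


for item in CALORIES_PER_SERVING:
    print(f"{item}: {traffic_light(item)}")
# ice cream: red / pasta: yellow / whole wheat bread: yellow / broccoli: green
```

Notice that everything this classifier knows about food is a calorie number; the social qualities of a meal never enter the picture. That is exactly the narrowing of perspective described above.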

What is good for one person doesn’t automatically work for another. By promoting certain actions and discouraging others, a digital coach presents a single view of what is good and what parameters such judgments ought to be based on. One interesting exception is the Dutch Foodzy app, which encourages healthful eating but doesn’t try to be a dictator about it. Foodzy users can earn badges for healthy as well as unhealthy behavior. You can get awards for stuffing away fruit and vegetables, but you can also become the king of the BBQ or claim a hangover badge by consuming a certain amount of alcohol.

Samsung called one of its smartphones a “Life Companion,” an appropriate description of a device that assists us in almost everything we do. But a real companion has to be reliable and honest, have integrity, and respect my personal values. These attributes, by the way, are part of the professional code that human coaches must live up to. Our pocket companions still have a long way to go before they can earn our trust.

Future Tense is a partnership of Slate, Arizona State University, and New America; Issues in Science and Technology is a publication of the National Academies in partnership with Arizona State University and the University of Texas at Dallas.