Future Tense

Play It Out Before You Live It Out

Are ethical video games the future of on-the-job training?

This could be your next workplace ethics training.

Illustration by Slate. Images by BlueSkyImage/Shutterstock and zizar/Shutterstock.

On Thursday, June 4, Future Tense—a partnership of Slate, New America, and Arizona State University—will host a debate in Washington, D.C., on the future of jobs. For more information and to RSVP, visit the New America website.

This is you: Jennifer Brown, a fresh-faced employee at Rocket Skate Roller Coasters who just wants to do her very best. But right now you have a problem: Dylan, your IT guy, is really handsy. You don’t mind—he seems to mean well—but your other co-worker, Carol, won’t stop complaining to you about Mr. Touchy’s tendency to “grope” her shoulder whenever he fixes her computer.

Well, this is awkward—for you. What are you supposed to do, tell her to confront him? Advise her to go to human resources? Or say she should just ignore it, because his hands linger on everyone like that?

This is a scenario created by Will Interactive, an innovative video simulations company that’s trying to modernize on-the-job ethics training. In Will’s simulations, you confront dilemmas you might actually face at your job, then watch your decisions play out before you. The goal is to initiate the complex thought process required for making tough decisions before you actually face them at work—so that, when the time comes, you’ve thought about your possible options and their consequences enough to feel prepared. Will’s slogan: “Play it out before you live it out.”

Most of us think of video games as an escape from reality, a way to present the world in more manageable, simplified form. Will’s games, by contrast, try to illustrate the moral gray areas that can come up with real-life ethical quandaries. “It gets you into the headspace of making these decisions, before you actually do,” says Sharon Sloane, Will’s CEO, an entrepreneur with a master’s in counseling and a background in education. Think of it as moral training wheels—or bowling in life’s bumper lane.

These kinds of simulations started as a way to address large-scale societal problems. The company released its first interactive in 1996, a simulation meant to prevent the spread of HIV/AIDS. Will was later commissioned to create a military simulation where players could inhabit both the mind of a soldier dealing with suicidal tendencies and that of a friend of that soldier. In 2009, that simulation, Beyond the Front, became required viewing for every U.S. soldier and Army civilian around the globe during a two-hour stand-down. (Psychologists were on hand in case of emotional distress.) After the session, suicide rates decreased 60 percent over the following nine months, according to Will’s data.

This kind of training makes sense for people in high-stakes professions, like soldiers and world leaders. But what about the rest of us? Will thinks these kinds of first-person interactives might be just the thing ethics training needs to make it more relevant to everyday jobs. Because let’s face it: On-the-job ethics training is a joke. If you’re lucky, you might be instructed by your superiors to watch some standard sexual harassment training video—one that’s so dated and painfully acted that it’s pretty much impossible to take seriously—but for the most part, no one really teaches us how to deal with an ethical crisis until we find ourselves in the middle of one. By then, of course, it’s often too late.

Now, Will is expanding into the corporate marketplace to provide ethical training to us regular Joes. In the past few years, it’s introduced simulations for training engineers to deal with dangerous situations and academics to avoid falsifying research findings, as well as sexual harassment videos to help office workers make more ethical decisions. The goal is to use “the power of story, the magic of film, and the appeal of gaming to create highly immersive learning simulations,” according to Will’s promotional materials.

Great care goes into these sims: Will does months of interviews and psychological and legal research to make sure scenarios like Jennifer’s ring true to life. For other sims—such as one that simulates life as a soldier returned from Iraq and dealing with post-traumatic stress disorder—game developers spent time in an in-patient treatment facility at Walter Reed interviewing troops, hearing stories of how many soldiers still slept with a weapon under their pillows and had trouble handling the stress of entering a crowded Walmart. For that simulation, which Sloane showed at a recent conference on using video games to deal with critical public health issues, Will drew primarily on real-life narratives of past soldiers, only lightly fictionalizing them to make them more universal.

While these simulations can be powerful for getting you into the headspace of certain extreme experiences, they’re not as good at capturing more mundane ones. Many of these scenarios don’t seem to resonate in the way they should; they feel wooden and forced. Why? The most striking thing is that the games all start with the phrase: “This is you.” Some, like the PTSD simulation, take place entirely in second person: That is, your emotions are dictated to you by an on-screen narrator. Will explains that it does this because it wants to put you in the shoes of the character; you need to believe in Jennifer and her character arc, her friendships, and how she occasionally likes to get orange chicken from the Chinese takeout place down the street for lunch.

But in this case, the effect is often not to make your character feel more intimate. Instead, it just feels intrusive. Any narrative resonates because it jibes with your own experience—it has emotional verisimilitude. When someone dictates your qualities and emotions to you, you resist identifying. Consider that Will uses the language of coercion to describe its products: The goal is to “modify behavior and improve human performance,” Sloane says.* Will’s games “lower people’s defenses and let messages come in that will cause them to reflect and potentially influence their attitudes and behaviors.” Those phrases come with a sinister undertone—it sounds a little like gay conversion therapy, or subliminal messaging, or something from Inception. There’s a feeling that your brain is being molded by a force beyond your control. No one wants to have her behavior modified.

Will’s goals—to create realistic characters and recreate some of life’s more nuanced emotional challenges—are unquestionably admirable. It’s a trend that’s sweeping commercial gameplay as well: Whereas the vast majority of commercial games are about “killing or destroying things,” what players are really after is an emotional connection to the character and story, says Brianna Wu, feminist video game activist and head of development at Giant Spacekat. Without that, she told me, “It doesn’t feel genuine. It’s all about what your character looks like. But the rest of the world—it’s so basic. It’s baby talk. If you mansplain a character, the person playing is going to hate you.” The same goes for simulations aimed not toward entertainment but toward “behavior modification.”

Will is part of a wave of simulations seeking to impact society by replicating more shades of moral nuance. One, a virtual reality simulation recently profiled by Wired, tries to teach cops not how to shoot but when to do so. (The answer is almost never clear.) These sims seek to simulate our surroundings in all their nuance, complexity, and, sometimes, ugliness. Clearly, they’re working on some levels: Military leaders have even described Will’s simulations as life-saving. Plus, as Sloane points out, in some ways the discussion and thought process that the “games” engender can be more important than the actual experience of playing them. Hours after watching the PTSD simulation, for instance, I found my mind wandering back to its haunting premise.

But to really get at the psyche of your average office worker, the company will need to rethink its approach. While I was clicking through the sexual harassment simulation, for instance, I found myself wishing the sim were more like the actual game The Sims, in which players can design some of the characteristics and personal qualities of their characters. Having more control over my character would let me feel more connected to it and empathize more with its struggles and goals. Alternatively, I felt the sim might have worked better if it had kept the emotional nuance but relied graphically on simpler, more cartoon-like visuals, so that I could project myself onto it.

Storytelling should be a seduction. You shouldn’t feel forced into feeling empathy or reaching an emotional conclusion. In the end, it isn’t about the technology: It’s about the writing.

Correction, May 31, 2015: This article originally misquoted Will Interactive CEO Sharon Sloane as saying Will’s goal is to “manipulate behavior and improve human performance.” Sloane said Will’s goal is to “modify behavior and improve human performance.”