In the wake of the Great Recession, young men have become something of a conundrum for economists. Data from the Bureau of Labor Statistics say that men between the ages of 21 and 30 show reduced labor force participation compared with older men and women of all ages. For example, the market hours (a labor economics term for the hours a person spends in paid work) young men worked fell by 12 percent between 2000 and 2015, compared with a drop of only 8 percent for older men. Yet, according to the General Social Survey, young men reported increased happiness during this same period of time.
A draft paper made available by the National Bureau of Economic Research, a clearinghouse for working papers in economic research, links at least part of this difference to video games: Statistical modeling by the authors suggests that young men are working an average of four fewer hours per week than they once did and spending three of those four hours playing video games.* The gaming doesn’t account for the full difference in workforce participation between younger and older men, but the authors estimate that it accounts for between 38 and 79 percent of that variance. The finding needs further research to confirm it, but if it holds, video games could be a significant factor in why young men are struggling in the workforce. It’s worthwhile to ask: If young men really are eschewing work in favor of gaming, then what might that tell us about the role games play in their lives?
The authors’ model shows that gamers value time spent on video games far more highly than television, which itself ranks well above other forms of leisure. But they don’t attempt to answer what it is about video games that makes them so compelling. One possible explanation is that games, unlike other forms of entertainment, effectively simulate the most positive (and only the most positive) aspects of work itself.
Consider the differences between what a gamer gets from a good Destiny session and what a TV addict gets from the tube. Television can relieve boredom, and to an extent it can cover loneliness with simulated on-screen companions, but it can never provide the pride that comes of mastering a new skill, setting and reaching a goal, or besting an opponent. Video games can do that, and newer games in particular offer a feeling of accomplishment without forcing the player to experience the frustration that comes with actually having to master a difficult skill.
Popular series like Assassin’s Creed, Mass Effect, Fallout, and Dishonored all look great and offer a fun and immersive experience, but aren’t really much of a challenge for most players on the default setting. These games follow a template familiar to most gamers: The new player is presented with a handful of things to master—weapons of different ranges and power, unique vehicles or ways of crossing terrain, a hand-to-hand combat system, an inventory to manage, etc. At first it may be a bit overwhelming, but basic facility comes quickly. From there the player gets more powerful weapons, new abilities, or increased capacity that roughly tracks with any increased difficulty in the scenarios the game presents. You get a feeling of perpetually gaining strength and conquering new difficulties—even if a new challenge takes a try or two, it’s never so difficult that a relatively unskilled player doubts his or her ability to conquer it. There are, of course, difficult games and harder settings for easier titles, if you want a challenge, but modern blockbuster games have tended to keep things accessible in order to have the widest appeal possible.
In the real world, the experience of slowly gaining mastery of a difficult skill and eventually being rewarded for it is called “work.” Work can be tedious or frustrating, and it often carries with it a risk of failure or embarrassment, but in the right circumstances it can be deeply rewarding. When those rewards are stripped of risk, commodified, and offered as entertainment, you get modern video games. Is it any wonder that some people may be choosing them over the real-life analogue?
In social science there’s a framework called self-determination theory, developed by psychologists Edward Deci and Richard Ryan, that seeks to explain human motivation. It suggests that humans are not driven simply by rewards and punishments, but also (and in many cases even more strongly) by innate psychological requirements for autonomy, competence, and relatedness. Activities that are driven primarily by these three basic factors are considered intrinsically motivated; extrinsically motivated actions are undertaken to gain rewards or avoid punishment. Deci found that if you offer people money for an intrinsically motivated task—like working on a puzzle or generating newspaper headlines—people will actually spend less time on it.
Psychologists in the field have since sought to facilitate intrinsic motivation to improve learning in schools and employee investment in the workplace. They’ve found that giving people more choice of tasks (autonomy) tends to increase their motivation while restricting them decreases it. They’ve also found that offering positive praise instead of money (reinforcing competence) for an intrinsically motivated task increases subsequent time spent on that task rather than decreasing it. Tasks that involve an element of creativity or skill tend to be intrinsically motivated while simpler, more repetitive tasks are more extrinsically motivated and respond more normally to rewards and punishments.
That brings us back to video games. Games have always offered players a chance to experience competence by requiring them to solve puzzles or master new skills. In this way they’re similar to other intrinsically motivated tasks like working on a physical puzzle or playing a sport. Most of today’s games include elements of autonomy so that players can make choices about where to explore, which goals to pursue, or how to customize their characters and gear. That’s two of self-determination theory’s basic psychological needs. The one that remains is relatedness—a feeling of connection to other people. With the advent of massively multiplayer online role-playing games and live-streaming services like Twitch, social contact is increasingly part of gaming. It’s likely no coincidence that the people who are most likely to feel comfortable and find their peers in these social gaming environments are young men—the very people who are apparently choosing to forgo work hours in favor of more game playing.
Of course, we want our hobbies to fulfill us, and you can find competence, autonomy, and relatedness in anything from softball to crochet to crossword puzzles. If you’re a young man living in a community where the available jobs are repetitive and low-skilled, offer little prestige, don’t have a path for advancement, and aren’t particularly well compensated, however, there may not be many other opportunities for you to meet these needs. Video games offer an alluring, almost sinister ability to flatter you into feeling competence, soothe your need for autonomy by offering in-game choices, and connect you to other people. Games may be doing their job too well, keeping players from seeking true creative outlets, forging independent paths through life, and achieving in school or the workplace, because their needs are being blunted by a synthetic substitute.
For decades, popular criticism of video games has focused primarily on their content, their plots, their characters, their violence, or their sexuality, as if games were basically the same thing as movies or television. This makes no sense. The experience of playing a game is nothing like watching a movie or a TV show, or any other passive consumption of entertainment. In games the content is secondary—it’s the play that matters. But if new research is correct—and admittedly that is a big “if”—then we may need to worry. All play and no work makes for very dull boys.
This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.
*Correction, July 26, 2017: This article originally misidentified a draft paper on working and video games habits among young men as a published study. (Return.)