Future Tense

The Army Isn’t Getting Much Use Out of Its Expensive Training Games

OK, not the fun kind of games, exactly.


The United States Army has a problem: Its soldiers aren’t playing enough video games.

That, at any rate, is one conclusion you might draw from a recent Government Accountability Office report that examines the Army’s numerous—and often quite expensive—virtual training devices. Discussing that report in Motherboard, Richard Beckhusen concludes “the Army simply has too many games.” In some cases, that’s because the Army hasn’t properly integrated those systems into its constantly changing training regimes. In others, it may be because these pricey platforms simply don’t meet real needs in the first place.

As the GAO report defines it, “virtual training devices are those devices that involve a simulator, a simulation, or a computer-generated battlefield.” As such, a medical dummy probably wouldn’t count, but a VR simulation of field surgery might. In theory, these systems seek to re-create “conditions that are not possible to achieve in live training,” either because it would be expensive to do so (heavy weaponry munitions don’t come cheap) or because it’s hard to reproduce varied field conditions on demand. Fittingly, then, many of the systems that GAO examines simulate vehicle operations, while others help soldiers familiarize themselves with unit tactics and operational protocols.

The proliferation of training simulations in the military might seem to be of a piece with the general shift toward gamification in civilian settings—and the concomitant expectation that learning a new job or skill should be fun. While such ideas may simmer in the background here, few of the systems discussed in the GAO report seem like they’d be that enjoyable to play. For more than a decade, the military has used the game America’s Army for recruitment, attempting to give players a feel for the actual demands of service. Such militainment isn’t really on the table in the GAO report, however. Indeed, while words like “funding” and “function” crop up repeatedly throughout, mere “fun” is nowhere to be found.

It’s hard to say how much use the Army is actually getting out of its virtual training devices, partly because the service hasn’t kept strong records of actual engagement. Nevertheless, the GAO report’s inventory of Army-operated systems suggests that usage is limited at best: In 2015, trainees logged a mere 435 hours with the Army’s 18 units of one bulky-looking system designed to help develop “driving and operating skills in simulated weather, urban operations, and complex virtual terrain.” Given that the $12 million system cost $744,405 to maintain in that year alone—and that those 435 hours represent barely 1 percent of the 33,332 hours of available simulator time—that seems like a relatively poor return on investment.

Other training devices performed better: A $216 million system that “replicates live weapons training events” plopped soldiers in front of its simulator screens for more than 300,000 hours in 2015. Similarly, Virtual Battlespace II—which seeks to improve skills like “cultural awareness, language, [and] explosive device recognition”—appears to have performed reasonably well, racking up 18,673 user hours.

This moderate success, however, arguably squares with a point that Beckhusen extracts from the report: Many of the Army’s “games” may be too realistic for their own good. “It’s unnecessary to strap soldiers into an immobile vehicle and make them scan a wrap-around screen if they can accomplish the same basic tasks with a mouse and keyboard,” Beckhusen argues. One commenter on Slashdot suggests that this is hardly a new problem, writing, “[B]ack in the early 90s we’d go to the M-1 simulator and run through that. Then go back to the barracks and play M1 Tank platoon on my Amiga 500. It was a running joke I had my own simulator in my room.”

The Army’s bulkier simulators presumably offer things that modern commercial games—never mind Amiga titles!—can’t, but more conventionally gamelike programs may still offer advantages. As Beckhusen notes, for example, Virtual Battlespace “scales better” than many of the more sophisticated systems designed to train users on a particular vehicle or scenario. Significantly, such programs may also be more cost-effective: The Army spent about $8 million on the flexibly designed Virtual Battlespace but put almost 10 times that amount into its Conduct of Fire Trainer for M2/M3 Bradley vehicles, which includes a sophisticated-looking cockpit full of real knobs, buttons, and sensors.

Obviously, neither dollars spent nor training hours accumulated translate directly into true usefulness with these systems. In fact, many of these sums may amount to little more than rounding errors relative to the military’s massive annual budget. Nevertheless, the report still suggests that the Army needs to integrate these games more fully into its training schemes—both evaluating their potential efficacy before developing them and investigating their effectiveness once they’ve been deployed.