Back in Soviet times, there was a Russian army general who liked to bellow, "Analysis is for lieutenants and women." This brute-force approach to military matters didn't serve the Soviet Union well in the long run. Unfortunately, the same attitude seems to be creeping into the U.S. Army today.
Two pieces of evidence shine all too glaringly: 1) an official, unclassified, and highly critical report on the U.S. Army's inefficient-to-shoddy intelligence practices in Iraq and Afghanistan, written by the Center for Army Lessons Learned in Ft. Leavenworth, Kan.; and 2) the removal of this report from the center's Web site, after the Washington Post published a story summarizing its contents.
The report and its suppression make clear that, in pre-war training, combat deployment, and after-action assessments, the Army hierarchy in the field and the political hierarchy in Washington devote woefully scant resources to analysis of what they're doing—and that they hold the analysts themselves in contempt, sometimes lethally so.
Administration officials and their critics agree that better intelligence is needed to deal with the guerrilla war that's escalating daily in Iraq. So the release, late last month, of the "lessons-learned" report (which you can still read on the globalsecurity.org Web site) certainly dealt a shock. Some key findings:
- The 69 U.S. tactical human-intelligence (HUMINT) teams in Iraq were expected to produce at least 120 "intelligence information reports" a day, but they've been putting out, on average, just 30—"not because of the lack of activity but because of the lack of guidance and focus" from their superiors.
- Most of these superiors are junior military intelligence officers who "did not appear to be prepared for tactical assignments." Even captains "lacked advanced analytical capabilities."
- HUMINT databases were stored on separate computer systems, many of them loaded with incompatible software, none of them connected in such a way that the data could be shared. As the report dryly puts it, "Connectivity between the terminals was non-existent, and had an adverse effect on HUMINT mission capability."
Other phrases that pop up repeatedly in the report: "very little to no analytical skills," "lacked the foundations of collection management," "junior officers who had no formal training," "information overflow," "no internal analysis capability," "lack of competent interpreters," "no ability to analyze the information," and so forth.
The report also finds that HUMINT personnel were (and, one Pentagon official tells me, still are) often ordered to take part in four-man units that kick down doors and raid buildings. The report notes that it's a bad idea to use spies in this way: "THTs [tactical HUMINT teams] rely on the rapport they generate with the local population and their ability to collect information. Putting them on a door-kicker team ruins that rapport."
But here comes the killer (literally). The report adds, in wryly understated parentheses, that when HUMINT agents were assigned to a door-kicker team, "they were usually the #2 man, *who statistically is the person who gets shot*." (Italics added.)
In other words, intelligence-gathering and intelligence-analysis teams are held in such low esteem that they're supplied with mismatched computer systems, they're manned by junior officers (or more senior officers who've received little training), they're assigned to risky raid operations that have nothing to do with their missions, and, as if to place an exclamation point on their dispensability, they're put in the raid-team's most dangerous slot.
Why is this happening? Certainly no one in a position of authority explicitly wants to put spies at greater risk or to withhold resources from those who analyze the information that spies gather. A likely explanation is simply that the incentives—the systems of rewards and penalties that govern modern military life—are geared elsewhere.