In a classic bit of black humor from the 1970s, Saturday Night Live once put on a sketch that imagined a Jerry Lewis movie called The Nutty Air Traffic Controller. Air traffic control is not an endeavor that can easily accommodate errors, much less slapstick pratfalls, so the Washington Post caught my attention Dec. 31 with the headline, "Air Traffic Controllers Made Record Number of Mistakes in 2010." According to the story, written by Ashley Halsey III, the number of air-traffic-control errors recorded nationwide increased in fiscal year 2010 by 51 percent, to 1,869. An earlier Aug. 30 story by Halsey had reported that with one month to go in fiscal year 2010 (October 2009 through September 2010) the number of air-traffic-control errors recorded nationwide had increased by an even wider margin, from 754 to 1,257. That's 67 percent. Whether 51 percent or 67 percent, this kind of increase in air-traffic-control errors struck me as something to worry about.

So I clicked over to the Federal Aviation Administration's Web site to find Halsey's data source. It wasn't there. So I phoned the FAA. A press officer there said the information had not been made public. That struck me as peculiar. When the government collects statistics about its performance, such data is typically made public; often the law requires it to be, especially when the data bears any relevance to consumer safety. The question of how often air traffic controllers make errors has no bearing on national security, and answering it risks no release of information that private companies might wish to keep proprietary. Why the secrecy?

Later that day, the FAA press officer sent me a statement the FAA had prepared in response to the Post stories. The gist was that any increase in errors was attributable to the FAA's transition in fiscal year 2010 "to a non-punitive error reporting system."
Before the switchover, air traffic controllers didn't always report their errors to their superiors because they feared reprisal. Under the new system, there was no reprisal when an air traffic controller reported a boo-boo that his boss failed to notice. Granted such absolution, air traffic controllers were reporting more boo-boos.
The trouble with the FAA's rationale was that it didn't answer (or even acknowledge) something Halsey had reported in his December Post story. According to Halsey (who cited unnamed sources at the FAA and the National Air Traffic Controllers Association, successor union to the Reagan-smashed PATCO), the official error count didn't include self-reported errors. Thinking maybe I had misread the piece, I phoned Halsey. That's right, he said; the 51 percent increase could not be blamed on the uptick in self-reported errors because it didn't include them.
The FAA's statement basically confirmed Halsey's December error count for fiscal year 2010 (Halsey said 1,869; the FAA said 1,889). But it didn't give me the fiscal year 2009 error count so I could confirm Halsey's calculation that errors had increased by 51 percent, and when I asked for it I received no reply from the FAA press office. The FAA also said in its statement that of the 1,889 errors only 445 concerned events classified as A or B (the more potentially dangerous the error, the higher the grade). But it didn't give me the fiscal year 2009 error count for A and B events, so I couldn't calculate whether these had increased, and if so, by how much.
At this point I decided to do what journalists do when they run into a brick wall. I filed a formal online request for the data under the Freedom of Information Act. I asked for "expedited processing" on the grounds that the data had already appeared in the Washington Post. "If the Post's data and interpretation are correct," I wrote, "then there are serious safety concerns regarding the air traffic control system that the public has a right, indeed a compelling need, to know." Mere seconds after I pushed the button, an e-mail from the FAA appeared in my inbox. The e-mail contained the text of my FOIA request and a photograph of me. That creeped me out a little, because I had never before submitted a FOIA request to the FAA, nor even talked to the FAA, that I could remember. The photograph was from my Facebook page. Oh well, I told myself. If the FAA's computers can figure out this fast what I look like, the agency ought to be able to locate the data I requested within a day or two.
That was Jan. 5. By Feb. 9 my FOIA request still hadn't been answered.
I decided to contact the FAA public affairs office again. I asked: May I have the data now? The reply: Sure! The agency promptly sent the same unhelpful written statement it had sent the month before. I complained: No, this is what you gave me before! Finally I was put in touch with Laura J. Brown, deputy assistant administrator for public affairs. Brown was able to answer my questions.
Right off she said that Halsey's error number for fiscal year 2009 was itself an error. I wasn't sure which number she was talking about—the 754 air-traffic-control errors he reported in August or the roughly 1,200 implied by his December story. (My algebra's a little rusty, but if the fiscal year 2010 number Halsey reported was 1,869, and if Halsey wrote that represented a 51 percent increase over fiscal year 2009, then the number for fiscal year 2009 would have to be a little more than 1,200.)
The correct number for air-traffic-control errors in fiscal year 2009, Brown said, was 947. Which would mean (using the FAA's own figures) that air controllers' error rate didn't increase by 51 percent, but rather by, uh … 99 percent.
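For readers who want to check the arithmetic, here's a quick sketch in Python. The figures are simply the ones quoted above—Halsey's and the FAA's—not an independent data source.

```python
def pct_increase(old, new):
    """Percent increase from old to new."""
    return (new - old) / old * 100

# Halsey's December story: 1,869 errors in fiscal 2010, a 51 percent jump.
# Back-solving gives the fiscal 2009 baseline his figure implies.
implied_fy2009 = 1869 / 1.51            # roughly 1,238 -- "a little more than 1,200"

# The FAA's own numbers: 947 errors in fiscal 2009, 1,889 in fiscal 2010.
faa_increase = pct_increase(947, 1889)  # roughly 99 percent, not 51
```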
But wait! Comparing 2009's 947 with 2010's 1,889 really wasn't legitimate, Brown said, because Halsey was wrong when he said the error totals didn't include self-reported errors. Or rather, he wasn't entirely right. It was true that there were two separate databases, one for self-reported errors and the other for the official tally of errors. But often the same error would get reported to both databases. How often? Brown couldn't say. But there was a huge increase in category C (i.e., least dangerous) errors between fiscal year 2009 and fiscal year 2010. Self-reported errors are likeliest to be C errors, because it's difficult to make a really scary Jerry Lewis-Nutty Professor-type error and not have anyone else notice. There were 618 C errors in fiscal 2009 and 1,444 C errors in fiscal 2010. If all 826 of the additional C errors in 2010 were attributable to self-reporting, and therefore excluded from the 1,889 total, there would still be an apples-to-apples increase in errors. But it would be a comparatively modest 12 percent.
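Taken at face value, that adjustment works out like this—a sketch that assumes, generously, that every one of the additional C errors was a self-reported duplicate:

```python
extra_c = 1444 - 618                  # additional category-C errors in fiscal 2010
adjusted_fy2010 = 1889 - extra_c      # fiscal 2010 total after excluding them all
adjusted_increase = (adjusted_fy2010 - 947) / 947 * 100   # about 12 percent
```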
The hitch is that we don't really know how many of the 826 C errors were self-reported. It's actually pretty doubtful all 826 were self-reported because the new self-reporting system was phased in gradually through fiscal year 2010.
Brown also gave me the missing numbers for category A and B (i.e., the more-scary) air-controller errors for fiscal year 2009. They added up to 329. That means (I later calculated) that A and B errors rose by about 35 percent between 2009 and 2010, from 329 to 445. And the increase reached into the absolute scariest category, A. These errors rose from 37 to 44, i.e., by 19 percent.
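The category-A arithmetic, at least, is easy to verify from the FAA's figures quoted above:

```python
a_2009, a_2010 = 37, 44                        # category-A errors, per the FAA
a_rise = (a_2010 - a_2009) / a_2009 * 100      # about 19 percent
```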
I asked Brown why the FAA didn't answer the Post story by removing the duplicates from the regular database so we wouldn't have to guess what the apples-to-apples trend was for all categories of air-traffic-control errors from year to year. "I understand what you're saying," she replied, "but that's not how we keep the data." But, I protested, you have both databases. You surely have the means to identify every error that gets reported to the FAA—where it happened, what it was. They number fewer than 2,000 per year!
Brown answered by telling me there was another complicating factor—something called the "terminal area reporting system." That's an automated error-reporting system that tattles on air traffic controllers who commit errors. It's been used for years on planes at high altitudes, but during the past two years it's been phased in for planes operating at low altitudes. The phase-in messes up the kind of comparison I want to make, and so does the fact that at low altitudes the automated error-reporting is switched on at some times and not switched on at other times.
Another factor, Brown said, is how many planes are flying one year versus the previous year. Since the recession the trend has been downward. Which means, I answered, that there are more errors by air traffic controllers even as they have less air traffic to keep track of? Isn't that kind of … bad?
Please remember, Brown said, that the FAA's goal is to constantly improve its system for catching errors.
I'm all for that. But I don't see why the FAA won't make a parallel effort to keep more precise track of whether the performance of its air traffic controllers is getting better or worse. That would tell consumers something about whether flying was getting safer or more dangerous. Could it be that the FAA would just as soon we didn't know?