I'm sitting here with four passports on my desk. They all have photos. The subject of one looks like a 9/11 hijacker. Another looks like a high-school boy delivering pizza. Another looks like a washed-out ex-kid TV star who's been busted for drugs. Another looks like a Latin American child who needs a liver transplant.
The people in the photos are, respectively, me, my wife, my son, and my daughter.
I'm sharing this embarrassing information with you for two reasons. One, because it's already in the possession of the U.S. government and every government that coordinates with ours. And two, because today's news file brings us a story from the Guardian about British plans to scan passengers at airport gates using face-recognition software. The idea is to "improve security and ease congestion."
The scans certainly will improve security. But that's because of a human decision as to how the machines will be programmed. And that decision, in turn, might exacerbate, rather than ease, congestion. To make this kind of technology work, we have to understand that it requires human management and human assistance.
Face-recognition software looks for a match between the passenger at the gate and a stored photo. There are two kinds of photos you can ask the computer to match. One is a collection of bad guys whose pictures the government has stored in a database. The other is the photo stored by the government as the face that goes with the chip in your passport. Let's call the first kind a suspect scan and the second an ID scan.
If we set up the scanner to look for suspect matches and it can't match you to any of the bad guys in the database, you go through. But if we set up the scanner to look for an ID match and it can't match you to your passport photo, you have a problem. The suspect scan puts the burden of matching on the computer. The ID scan puts the burden of matching on you.
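For readers who like to see logic laid bare, here is a minimal sketch of that distinction. This is not any real system's code; the threshold value and function names are purely illustrative, and real face-recognition systems are far more complex. The point is only how the two modes treat a failed match differently:

```python
# Illustrative only: the two scan modes differ in where a failed match
# sends you. The 0.8 threshold is a made-up cutoff for "same face."
SIMILARITY_THRESHOLD = 0.8

def suspect_scan(watchlist_similarities):
    """Suspect scan: compare the live face against every watchlist photo.
    If nothing matches, you pass. The computer bears the burden of matching."""
    return all(score < SIMILARITY_THRESHOLD for score in watchlist_similarities)

def id_scan(passport_similarity):
    """ID scan: compare the live face against your own passport photo.
    If it doesn't match, you fail. You bear the burden of matching."""
    return passport_similarity >= SIMILARITY_THRESHOLD
```

Note the asymmetry: in the suspect scan, a recognition failure is good news for the passenger; in the ID scan, the same failure is bad news.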
My passport photo was taken nine years ago. I had a lot more hair. I wasn't wearing glasses. I looked tanner and stronger. Last month, when I went through airport security, the officer took a good, long look at both me and my passport. She had to look past the changes of nine years and evaluate both images for subtle similarities.
If she'd been a computer, I probably would have flunked that test. Remember, the whole point of using computers to relieve congestion is that they'll scan us and render their decisions more quickly than humans do. My wife might have flunked, too. And even though my kids' photos were taken just a month ago, I can see how the sheer weirdness of staged photography and the randomness of how they looked that day could cause them to be bounced as well.
This is the problem with the British plan. According to Guardian reporter Owen Bowcott, "Unmanned clearance gates will be phased in to scan passengers' faces and match the image to the record on the computer chip in their biometric passports." In other words, it's an ID scan. If the computer can't match you to your ID, you flunk. And there's nobody at the gate to follow up. You have to get in some other line or go through "additional checks."
It's not clear what "additional checks" means. Here's what it should mean: If the computer flunks you, a human being is on hand to give you and your photo a visual scan. Experts point out to Bowcott that current face-recognition software errs strongly on the side of not finding matches. That's fine: The decision to insist on ID matches is a human decision, and it does enhance security. But if you want to ease congestion at the same time, the computer's failure to match you to your photo can't be treated as a conclusion. It has to be treated as an initial sorting process that directs you to old-fashioned human scanners.
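The triage the column argues for can be sketched in a few lines. Again, this is a hypothetical illustration, not the British system's actual logic; the function and return strings are invented for clarity:

```python
def gate_decision(machine_match, human_available=True):
    """Hypothetical sketch of the argued-for flow: a failed machine match
    is a sorting step, not a verdict. It should route the passenger to a
    human officer for a visual check, not reject them outright."""
    if machine_match:
        return "cleared by machine"
    if human_available:
        return "routed to human officer for visual check"
    return "rejected: additional checks elsewhere"
```

The whole dispute turns on that `human_available` branch: with it, the machine only eases the officers' workload; without it, the machine's conservatism becomes the passenger's problem.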
"There is concern that passengers will react badly to being rejected by an automated gate," Bowcott reports. I'll say. I'm a fan of high-tech security scans, even when they see through your clothes. But technology alone is never enough. The buck still stops with us.