The size of the problem space made it impossible. Any margin of error whatsoever, multiplied by (the number of people you're looking for + the number of people passing through the airport), leads to an insane number of false positives. The German federal security service ran a trial with Siemens' recognizer many moons back, loved the technology, hoped the number of false positives would be small... and was disappointed. Even granting an unreachably high accuracy, it kept tagging grandma as a terrorist.
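The base-rate arithmetic is easy to sketch. All numbers below are hypothetical, just to show the shape of the problem: even a recognizer that wrongly flags only 1 person in 1000 swamps the rare real target with false alarms.

```python
# All figures are made-up illustrations, not real trial data.
false_positive_rate = 0.001   # hypothetical: wrongly flags 1 in 1000 innocents
passengers_per_day = 100_000  # hypothetical airport throughput
targets_per_day = 1           # hypothetical: one genuine target in the crowd

# Expected innocent people flagged per day:
false_alarms = passengers_per_day * false_positive_rate
print(false_alarms)  # 100.0 -- a hundred grandmas tagged for every real target
```

So even with 99.9% specificity, a hit from the system is overwhelmingly likely to be a false alarm, because innocents outnumber targets by five orders of magnitude.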

It's like the birthday paradox: even though any given pair of people has only a 1-in-365 chance of sharing a birthday, it turns out that with just 23 people in a room you have a 50% chance of two birthdays matching, and a 99% chance with 75 people. See http://danteslab-eng.blogspot.... As he notes, if you have a system that is 0.999999 accurate (a one-in-a-million error rate), you hit a 50% chance of a false positive or false negative as soon as you have scanned 1178 people... meaning for roughly every thousand people, we either arrest grandma or let Osama bin Laden stroll through.
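The birthday-paradox numbers above can be checked directly. The sketch below computes the exact collision probability (the chance that at least two of n draws from d equally likely values coincide); plugging in d = 10^6 for the one-in-a-million system reproduces the ~1178-person threshold:

```python
def collision_prob(n, d):
    """Probability that at least two of n samples drawn uniformly
    from d possibilities collide (the birthday paradox)."""
    p_no_match = 1.0
    for k in range(n):
        p_no_match *= (d - k) / d  # k-th sample avoids all k earlier ones
    return 1.0 - p_no_match

print(collision_prob(23, 365))      # ~0.507: 50% with 23 people
print(collision_prob(75, 365))      # ~0.9997: the "99% with 75 people" figure
print(collision_prob(1178, 10**6))  # just over 0.5: the 1178-scan threshold
```

The threshold grows only with the square root of d (roughly 1.18 * sqrt(d)), which is why a one-in-a-million error rate buys you only about a thousand clean scans, not a million.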

They've probably reported that already, and been told "don't worry about mere mathematics, this is *politics*" (;-))