Originally posted by: episodic
http://online.wsj.com/public/article/SB...a_w_20070813.html?mod=tff_main_tff_top
At airport security checkpoints in Knoxville, Tenn., this summer, scores of departing passengers were chosen to step behind a curtain, sit in a metallic oval booth and don headphones.
With one hand inserted into a sensor that monitors physical responses, the travelers used the other hand to answer questions on a touch screen about their plans. A machine measured biometric responses -- blood pressure, pulse and sweat levels -- that then were analyzed by software.
The trial of the Israeli-developed system represents an effort by the U.S. Transportation Security Administration to determine whether technology can spot passengers who have "hostile intent."
Trained teams watch travelers in security lines and elsewhere. They look for obvious things like someone wearing a heavy coat on a hot day, but also for subtle signs like vocal timbre, gestures and tiny facial movements that indicate someone is trying to disguise an emotion.
"All you know is there's an emotion being concealed. You have to find out why the emotion is occurring," says Paul Ekman, a San Francisco psychologist who pioneered work on facial expressions and is informally advising the TSA. "You can find out very quickly."
The explanations for hiding emotions often are innocent: A traveler might be stressed out from work, worried about missing a flight or sad because a relative just died. If suspicions remain, the traveler is interviewed at greater length by a screener with more specialized training. SPOT (Screening of Passengers by Observation Techniques) teams have identified about 100 people who were trying to smuggle drugs, use fake IDs and commit other crimes, but not terrorist acts.
The technology isn't geared toward detecting general nervousness: Mr. Shoval says terrorists often are trained to be cool and to conceal stress. Unlike a standard lie detector, the technology analyzes a person's answers not only in relation to his other responses but also to those of a broader peer group determined by a range of security considerations. "We can recognize patterns for people with hostile agendas based on research with Palestinians, Israelis, Americans and other nationalities in Israel," he says.
In the latest Israeli trial, the system caught 85% of the role-playing terrorists, meaning that 15% got through, and incorrectly identified 8% of innocent travelers as potential threats, according to corporate marketing materials.
The company's goal is to prove it can catch at least 90% of potential saboteurs -- a 10% false-negative rate -- while inconveniencing just 4% of innocent travelers.
======================================================
It is amazing how many of the things that naysayers said would 'never' happen are happening. So travel is eventually going to boil down to how well you do on a lie detector? Funny how no one will 'talk about' this development.
Thank goodness it is the WSJ reporting this, or some might never believe it.
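For what it's worth, here is a rough back-of-the-envelope check on the numbers quoted above, as a short Python sketch. Only the 85% detection / 8% false-positive figures and the 90% / 4% goal come from the article; the daily passenger count and the number of actual attackers are assumptions I made up purely for illustration.

# Rough check on the screening rates quoted in the article.
# Assumed inputs (NOT from the article): 50,000 departing passengers
# and 10 would-be attackers per day.
def screening_outcomes(passengers, attackers, detection_rate, false_positive_rate):
    """Return (attackers caught, attackers missed, innocents falsely flagged)."""
    innocents = passengers - attackers
    caught = attackers * detection_rate             # attackers correctly flagged
    missed = attackers * (1 - detection_rate)       # attackers who slip through
    false_alarms = innocents * false_positive_rate  # innocents pulled aside
    return caught, missed, false_alarms

for label, det, fp in [("trial results (85% / 8%)", 0.85, 0.08),
                       ("stated goal (90% / 4%)", 0.90, 0.04)]:
    caught, missed, false_alarms = screening_outcomes(50_000, 10, det, fp)
    flagged = caught + false_alarms
    print(f"{label}: {caught:.0f} attackers flagged, {missed:.0f} missed, "
          f"{false_alarms:.0f} innocents flagged "
          f"({caught / flagged:.1%} of flagged travelers are actual attackers)")

Under those made-up assumptions, even the company's stated goal would mean the overwhelming majority of people pulled aside for the longer interview are innocent travelers, simply because real attackers are so rare.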