I haven’t written up a “True Life Horror” article in a while, but when I came across this the other day my blood turned icy-cold in my veins and I had a flash-forward to an ever-growing government intruding into our daily and private lives. According to The Atlantic, the Department of Homeland Security is in the early stages of developing something called FAST that, once it’s installed at an airport near you, will attempt to catch terrorists before they strike.
FAST stands for Future Attribute Screening Technology, and the program will, according to The Atlantic, “remotely monitor physiological and behavioral cues, like elevated heart rate, eye movement, body temperature, facial patterns, and body language, and analyze these cues algorithmically for statistical aberrance in an attempt to identify people with nefarious intentions.” Are you fucking kidding me? So the DHS now wants to monitor our thoughts, intentions, and motivations? Screw George Orwell … not even he could foresee this s**t coming!!
The Atlantic points out two major problems with FAST. One, the research and development of this tech is being carried out in a laboratory setting, with volunteers brought in and told to carry out a disruptive act. FAST can currently pick out these “terrorists” with a 70 percent accuracy rate. Problem is, they aren’t actually terrorists; they’re people pretending to be terrorists, and real terrorists might well betray different physiological signs. Basically, the people reading the FAST results should be looking for people who are too calm and relaxed. Right? So no more Valium for people who are afraid and anxious when it comes to flying. Goddamn this is scary.
Two, let’s not forget, or ignore, the False-Positive Paradox:
“Let’s assume for a moment that 1 in 1,000,000 people is a terrorist about to commit a crime. Terrorists are actually probably much much more rare, or we would have a whole lot more acts of terrorism, given the daily throughput of the global transportation system. Now let’s imagine the FAST algorithm correctly classifies 99.99 percent of observations—an incredibly high rate of accuracy for any big data-based predictive model. Even with this unbelievable level of accuracy, the system would still falsely accuse 99 people of being terrorists for every one terrorist it finds. Given that none of these people would have actually committed a terrorist act yet, distinguishing the innocent false positives from the guilty might be a non-trivial, and invasive task.”
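If you want to check The Atlantic’s arithmetic yourself, here’s a quick back-of-envelope sketch using only the numbers from the quote above (1-in-1,000,000 base rate, 99.99 percent accuracy); the variable names and the round population figure are just for illustration:

```python
# Back-of-envelope check of the false-positive paradox from the quote above.
base_rate = 1 / 1_000_000           # assumed: 1 in 1,000,000 travelers is a terrorist
accuracy = 0.9999                   # assumed: classifier is right 99.99% of the time
false_positive_rate = 1 - accuracy  # so it wrongly flags 0.01% of innocent travelers

population = 1_000_000              # one "batch" of travelers, for easy numbers
terrorists = population * base_rate         # 1 actual terrorist in the batch
innocents = population - terrorists         # 999,999 innocent travelers

true_positives = terrorists * accuracy              # ~1 terrorist correctly flagged
false_positives = innocents * false_positive_rate   # ~100 innocents wrongly flagged

print(f"Flagged terrorists: {true_positives:.2f}")
print(f"Flagged innocents:  {false_positives:.2f}")
share_innocent = false_positives / (false_positives + true_positives)
print(f"Chance a flagged person is innocent: {share_innocent:.1%}")
```

Run it and you get roughly 100 innocent people flagged for every real terrorist caught, i.e. about 99 percent of everyone the system pulls aside would be innocent, which is exactly the paradox the quote is describing.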
FAST is a fucking nightmare of a project and whoever thought it up is a scary individual. And don’t give me the standard bullshit about, “Well if ya wanna feel safe flying then you’re gonna have to be ready to give up some of your freedoms.” F**k that … that’s bullshit. This is a slippery slope if I’ve ever read one. Let’s say they work some of the problems out (mainly the measly 70% success rate). What’s gonna stop them from using this FAST technology in other ways? I’m jumping way ahead, but why not scale the technology down into hand-held devices, so a cop can aim one at a driver on the road and see if he/she is planning on speeding? Have FAST “pods” up in stores that monitor whether or not people are intending to shoplift? Have FAST satellites tracking individuals who aren’t planning on paying their taxes, or who say anything negative about the government? The possibilities are endless. Given that FAST is, as noted, currently operating at 70% accuracy in a lab setting, the potential for false positives is too great to ignore. FAST seems to be a Shiny New Toy that will suck up lots of money and resources before someone realizes that, perhaps, the money and resources could be better spent some other way.
We can only hope, because the alternative simply scares the s**t out of me!! What do you think? Is FAST something worth looking into? Do we need to give up more liberty and personal freedom to feel safe?