THE A.I. THOUGHT POLICE
Source: Activist Post
Sunday, September 23, 2012
Julie Beal, Contributor
We know we’re being surveilled in the matrix: AI Law, empowered by algorithms, feasts on Twittered hate crimes and the like to try to predict crime. We are leaving virtual trails of data which are used to feed simulation models for predictive analytics – but we can still opt out, throw away our phones, disconnect. There is still some control over what they take from us. But the AI Thought Police want more – to climb into our minds, understand our physical make-up, really get to know us.
We are all under suspicion, but AI needs to know which ones to focus on. So the US Army is developing methods to covertly identify and track people who plan to do ‘something bad’. Hidden sensors will be used to detect AI’s version of ‘adversarial intent’ by reading and cataloguing our emotions and health.
A report called ‘Remote Detection of Covert Tactical Adversarial Intent of Individuals in Asymmetric Operations’ was authored by the US Army Research Laboratory in 2010; it details the requirements for researchers wishing to gain US Federal Government funding to develop techniques to home in on individuals in crowds, detecting antagonistic attitudes among the “clutter” of innocents. Because the prime directive is to protect national security, counter ‘insurgency’, and generally ‘keep the peace’, the technology that is developed will spread beyond airports into wider civilian applications, such as “crowd control and in antidrug, anticrime, and immigration enforcement.” In fact, applications in the civilian economy are said to be plentiful, and also include “border security, and ensuring the security of government and private personnel and property”.
The report points out that fusion of information of different types is getting better, such as combining data obtained with a laser with photographic or video images. This technique can help reveal “seemingly hidden patterns”; but the army wants the data to be detected from at least 3m away, and preferably up to 50m away, for use in “asymmetric defense scenarios”.
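The kind of “fusion” the report describes can be illustrated with a minimal sketch. Everything here is invented for illustration – the function name, the two modalities, and the weights are assumptions, not anything taken from the report – but it shows the basic idea of late fusion: each sensor produces its own normalised score, and a weighted combination can reveal a pattern that neither modality flags strongly on its own.

```python
# Hypothetical sketch of "late fusion": combining anomaly scores derived
# from two different sensor modalities (say, a laser vibrometer and a
# video feed) into one composite score. Names and weights are illustrative.

def fuse_scores(laser_score: float, video_score: float,
                w_laser: float = 0.6, w_video: float = 0.4) -> float:
    """Weighted average of two per-modality scores, each in [0, 1]."""
    for s in (laser_score, video_score):
        if not 0.0 <= s <= 1.0:
            raise ValueError("scores must be normalised to [0, 1]")
    return w_laser * laser_score + w_video * video_score

# Two middling per-modality scores still yield a middling fused score,
# but a strong reading in the heavier-weighted modality dominates.
print(fuse_scores(0.5, 0.5))
print(fuse_scores(0.9, 0.1))
```

Real fusion systems are far more elaborate (aligning data in time and space before combining it), but the principle is the same: one number out of many sensors.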
The report is basically a solicitation for research and development initiatives which can unobtrusively monitor physiological, biological, and physical reactions, and other behavioural signatures (facial expressions, eye movements, gestures, heart rate, perspiration, anomalous behaviour, etc.) in order to deduce thoughts and emotions. From this they hope to be able to detect adversarial intent, but the subjects mustn’t know they are being studied. Several trials are noted to have attempted to do this, and include:
* Future Attribute Screening Technology (FAST) - Department of Homeland Security, DHS
* Violent Intent Modeling and Simulation (VIMS) – DHS
* Detection of Intent through Perception of Biomotion Signatures - U.S. Army Research Laboratory Human Research and Engineering Directorate, ARL/HRED
* Visualization of Belief Systems - ARL/HRED
* Remote and Passive ID of Electrodermal Response - Night Vision and Electron Sensors Directorate, NVESD
* Behavioral Signatures, and Human MASINT - U.S. Air Force Research Laboratory, AFRL
* Hostile Intent - U.S. Naval Research Laboratory, NRL
* Computational Modeling of Adversary Attitudes and Behaviors - U.S. Air Force Office of Scientific Research, AFOSR, George Mason University
* Dynamic, Adaptive Techniques for Adversary Behavior Modeling - AFOSR, University of Maryland, U MD
* Human, Social, Cultural, and Behavioral Modeling - U.S. Army Research Office, ARO, Carnegie Mellon University, U MD
* Tools for Recognizing Unconscious Signals of Trustworthiness Program (TRUST) - Intelligence Advanced Research Projects Activity, IARPA
The main directive of the army report is to develop “theoretically justified quantitative predictive principles (models) and their implementation in tractable analytical and computational procedures.” In other words, it’s all about crunching numbers using algorithms, and it really is very ambitious, because it will go beyond mere physiological signs such as fear or stress to try to pinpoint cognitive intent, even in people who show no outward signs. This is also the trend in law enforcement, just like in Minority Report, where predictive analytics are used to make sense of surveillance data – we are empowering machines to be the Thought Police.
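To make the “crunching numbers using algorithms” concrete, here is a minimal sketch of one common predictive-analytics technique, logistic regression: measured signatures go in as numbers, weighted, summed, and squashed into a probability. The feature names, weights, and bias below are entirely invented for the sketch – nothing here comes from the report itself.

```python
import math

# Illustrative only: turning measured behavioural signatures into a
# predicted probability via logistic regression. Feature names, weights,
# and the bias term are invented, not taken from the army report.

def intent_probability(features: dict, weights: dict,
                       bias: float = -4.0) -> float:
    """Squash a weighted sum of feature values into a probability (0..1)."""
    z = bias + sum(weights.get(name, 0.0) * value
                   for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical model weights and one subject's normalised readings.
weights = {"heart_rate_elevation": 2.0, "posture_rigidity": 1.5,
           "gaze_scanning": 1.0}
obs = {"heart_rate_elevation": 0.2, "posture_rigidity": 0.1,
       "gaze_scanning": 0.3}
print(round(intent_probability(obs, weights), 3))
```

The unsettling part the article describes is not the arithmetic, which is trivial, but the decision to treat such a number as evidence of what someone is thinking.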
Researchers are expected to look to knowledge gained from:
* cognitive science
* sociocultural anthropology
The latest knowledge from each of these disciplines, of detectable signs of ‘(mal)intent’, is to be pooled together with the best innovations in remote bio-sensing techniques. The proposed list of characteristics to track and measure is:
* Posture rigidity
* Heartbeat waveform
* Heart rate
* Breath rate, volume approximation, patterns, anomalies
* Wheezing, coughing, gasping
* Blood pressure trends: waveform shape and transit time
* Pulse-wave velocity giving a beat-by-beat approximation of blood pressure
* Movement: fidgeting, remaining still, shaking, shivering, having spasms
* Body stiffness, muscle tension, resonant frequency of body movement
* Voice stress analysis and voice onset timing
* Gastrointestinal distress, bowel sounds
* Reluctance to engage socially: distance from others, response to attempts to engage verbally
* Observation tendencies of subject, eye-glancing, head turning, situational awareness
* People whose actions are coordinated or who are actively avoiding each other
* Exposure to bomb-making materials/chemicals
* Hyperthermia from stress (generally expressed in the face, palms of the hands, and soles of the feet)
* Gait as an indicator of stiffness (stress), of carrying a load, or of wearing protective clothing
* Breath biochemistry
* Microbiological organisms on skin or clothing
It is also suggested that, as well as taking such measurements from people in a relaxed state, little things should be put in their way to see how they respond. This “active elicitation” of information should be subtle, so as not to arouse suspicion and skew the results. It involves presenting a stimulus to the subject and analysing the behavioural response – seeing how the person reacts to a certain noise, picture, person, etc., while measuring physiological and emotive responses from a distance. These little incidents designed to elicit a response are called ‘perturbations’, and can be physical or psychological. The examples given include putting an obstacle in the path of the subject so as to cause a change in their way of moving, or flashing a picture of the subject onto a TV screen, “to let them know that they are being watched”.
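The stimulus–response logic of such a perturbation can be sketched very simply: record a baseline reading, apply the stimulus, and flag any change larger than a threshold. The signal (heart rate), the units, and the threshold below are all invented for illustration – the report does not specify them.

```python
# Toy sketch of "active elicitation": compare a remotely sensed reading
# before and after a perturbation, and flag a large change. The signal,
# units, and threshold are assumptions made for this illustration.

def mean(xs: list) -> float:
    return sum(xs) / len(xs)

def perturbation_response(baseline: list, post_stimulus: list,
                          threshold: float = 10.0) -> bool:
    """True if the mean reading rises by more than `threshold` after the stimulus."""
    return mean(post_stimulus) - mean(baseline) > threshold

resting = [72, 74, 73, 71]   # heart rate (bpm) while subject is relaxed
after = [95, 98, 96, 97]     # after the obstacle is placed in their path
print(perturbation_response(resting, after))  # True
```

Of course, a spiking heart rate after a surprise says nothing about intent – which is precisely the problem with treating such responses as evidence.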