Coming Soon From the Air Force: Mind-Reading Drones

Scientifically speaking, it's only a matter of time before drones become self-aware and kill us all. Now the Air Force is hastening that day of reckoning.

Buried within a seemingly innocuous list of recent Air Force contract awards to small businesses are details of plans for robot planes that not only think, but anticipate the moves of human pilots. And you thought it was just the Navy that was bringing us to the brink of the drone apocalypse.

It all starts with a solution for a legitimate problem. It's dangerous to fly and land drones at busy terminals. Manned airplanes can collide with drones, which may not be able to act on course corrections from air traffic control as swiftly as a human pilot can. And putting air traffic control in the middle of drone operations cuts against the desire for truly autonomous aircraft. What to do?

The answer: Design an algorithm that reads people's minds. Or the next best thing -- anticipates a pilot's reaction to a drone flying too close.

Enter Soar Technology, a Michigan company that proposes to create something it calls "Explanation, Schemas, and Prediction for Recognition of Intent in the Terminal Area of Operations," or ESPRIT. It'll create a "Schema Engine" that uses "memory management, pattern matching, and goal-based reasoning" to infer the intentions of nearby aircraft.
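Soar's filing describes the approach only in broad strokes, but the general idea -- matching observed flight behavior against stored patterns and reasoning about a pilot's likely goal -- can be sketched in a few lines. Everything below (the schema names, features and thresholds) is invented for illustration, not drawn from Soar's actual design:

```python
from dataclasses import dataclass

# Hypothetical flight-track features an onboard system might observe.
@dataclass
class TrackObservation:
    descent_rate_fpm: float       # feet per minute, positive = descending
    heading_change_deg: float     # degrees of turn over the last sampling window
    distance_to_runway_nm: float  # nautical miles

# Invented "schemas": expected feature ranges for a few candidate pilot intents.
SCHEMAS = {
    "landing_approach": {"descent_rate_fpm": (300, 1000),
                         "heading_change_deg": (0, 10),
                         "distance_to_runway_nm": (0, 10)},
    "go_around":        {"descent_rate_fpm": (-1500, 0),
                         "heading_change_deg": (0, 20),
                         "distance_to_runway_nm": (0, 5)},
    "erratic":          {"descent_rate_fpm": (-3000, 3000),
                         "heading_change_deg": (45, 180),
                         "distance_to_runway_nm": (0, 30)},
}

def infer_intent(obs: TrackObservation) -> str:
    """Return the schema whose expected ranges best match the observation."""
    def score(ranges):
        return sum(1 for feature, (lo, hi) in ranges.items()
                   if lo <= getattr(obs, feature) <= hi)
    return max(SCHEMAS, key=lambda name: score(SCHEMAS[name]))

if __name__ == "__main__":
    # A steady descent, lined up a few miles out, looks like a landing approach.
    print(infer_intent(TrackObservation(600, 3, 4)))  # -> landing_approach
```

A real system would have to weigh uncertainty and update its guess continuously as new radar or ADS-B returns arrive, but the pattern-matching core is the same shape.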

Since not every flight goes according to plan, the Schema Engine's "cognitive explanation mechanism" will also help the drone figure out whether a pilot is flying erratically or out of control. The Air Force signed a contract with Soar on Dec. 23; the company's representatives were not reachable for comment.

And Soar's not the only one. California-based Stottler Henke Associates argues that one algorithm won't get the job done. Its rival proposal, the Intelligent Pilot Intent Analysis System, would "represent and execute expert pilot-reasoning processes to infer other pilots’ intents in the same way human pilots currently do." The firm doesn't say how its system will work, and it has yet to respond to an inquiry seeking an explanation. A different company, Barron Associates, wants to pair sensors with algorithms to avoid collisions.

And Stottler Henke is explicitly thinking about how to weaponize its mind-reading program. "Many of the pilot-intent-analysis techniques described are also applicable for determining illegal intent and are therefore directly applicable to finding terrorists and smugglers," it told the Air Force. Boom: deal inked on Jan. 7.

Someone's got to say it. Predicting a pilot's intent might prevent collisions. But it can also neutralize a human counterattack. Or it can allow the drones' armed cousins to mimic Israel in the Six Day War and blow up the manned aircraft on the tarmac. Coincidentally, according to the retcon in Terminator: The Sarah Connor Chronicles, April 19, 2011 -- today -- is the day that Skynet goes online. Think about it.

The Air Force theorist Col. John Boyd created the concept of the "OODA Loop" -- "Observation, Orientation, Decision and Action" -- to guide pilots' decision-making in combat. He surely never imagined one of his loops would be designed into the artificial brain of an airborne robot.
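For readers unfamiliar with Boyd's cycle, a toy rendering of the four stages -- stubbed with invented sensor values, and not a depiction of any Air Force software -- looks like this:

```python
import random
import time

def observe():
    """Observe: gather raw sensor data (stubbed with random values here)."""
    return {"intruder_bearing_deg": random.uniform(0, 360),
            "intruder_range_nm": random.uniform(0.5, 10.0)}

def orient(raw):
    """Orient: turn raw data into a situational assessment."""
    return {"threat": raw["intruder_range_nm"] < 2.0, **raw}

def decide(situation):
    """Decide: choose a course of action from the assessment."""
    return "turn_away" if situation["threat"] else "hold_course"

def act(decision):
    """Act: carry out the decision (just printed here)."""
    print(f"executing: {decision}")

if __name__ == "__main__":
    for _ in range(3):       # three passes through the loop
        act(decide(orient(observe())))
        time.sleep(0.1)      # pacing; a real loop runs continuously
```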

Photo: Spencer Ackerman
