Center for Strategic Communication

Hat tip for a strong recommendation from Adam Elkus:

Josh Foust has a very sensible piece up about the seemingly endless furor over “killer drones” (we never called our warplanes “killer F-16s” or guided weapons “killer cruise missiles”).

The false fear of autonomous weapons 

…Many of the processes that go into making lethal decisions are already automated. The intelligence community (IC) generates around 50,000 pages of analysis each year, culled from hundreds of thousands of messages. Every day analysts reviewing targeting intelligence populate lists for the military and CIA via hundreds of pages of documents selected by computer filters and automated databases that discriminate for certain keywords.

In war zones, too, many decisions to kill are at least partly automated. Software programs such as Palantir collect massive amounts of information about IEDs, analyze it without human input, and spit out lists of likely targets. No human could possibly read, understand, analyze, and output so much information in such a short period of time.

Automated systems already decide to fire at targets without human input, as well. The U.S. Army fields advanced counter-mortar systems that track incoming mortar rounds, swat them out of the sky, and fire a return volley of mortars in response without any direct human input. In fact, the U.S. has employed similar (though less advanced) automated defensive systems for decades aboard its navy vessels. Additionally, heat-seeking missiles don’t require human input once they’re fired – on their own, they seek out and destroy the nearest intense heat source regardless of identity.

It’s hard to see how, in that context, a drone (or rather the computer system operating the drone) that automatically selects a target for possible strike is morally or legally any different than weapons the U.S. already employs.


Most of the anti-drone arguments are a third-hand form of opposition to US foreign policy or counterterrorism policy for a variety of reasons — sometimes tactical and strategic, but mostly just political. Saying you are against inhuman drone strikes sounds a hell of a lot better than honestly saying you would be against any kind of effective use of military force by the US against al Qaida and the Taliban in any and all circumstances. I can’t imagine Human Rights Watch would be happier if the US were using F-16s and B-52s instead.

Or commandos with small arms for that matter.