Report Cites Dangers of Autonomous Weapons

Feb 29, 2016

Photo credit: Kim Hong-Ji/Reuters

By John Markoff

A new report written by a former Pentagon official who helped establish United States policy on autonomous weapons argues that such weapons could be uncontrollable in real-world environments where they are subject to design failure as well as hacking, spoofing and manipulation by adversaries.

In recent years, low-cost sensors and new artificial intelligence technologies have made it increasingly practical to design weapons systems that make killing decisions without human intervention. The specter of so-called killer robots has touched off an international protest movement and a debate within the United Nations about limiting the development and deployment of such systems.

The new report was written by Paul Scharre, who directs a program on the future of warfare at the Center for a New American Security, a policy research group in Washington, D.C. From 2008 to 2013, Mr. Scharre worked in the office of the Secretary of Defense, where he helped establish United States policy on unmanned and autonomous weapons. He was one of the authors of a 2012 Defense Department directive that set military policy on the use of such systems.

In the report, titled “Autonomous Weapons and Operational Risk,” set to be published on Monday, Mr. Scharre warns about a range of real-world risks associated with weapons systems that are completely autonomous.

The report contrasts these completely automated systems, which have the ability to target and kill without human intervention, with weapons that keep humans “in the loop” in the process of selecting and engaging targets.

Mr. Scharre, who served as an Army Ranger in Iraq and Afghanistan, focuses on the potential types of failures that might occur in completely automated systems, as opposed to the way such weapons are intended to work. To underscore the military consequences of technological failures, the report enumerates a history of the types of failures that have occurred in military and commercial systems that are highly automated.

“Anyone who has ever been frustrated with an automated telephone call support helpline, an alarm clock mistakenly set to ‘p.m.’ instead of ‘a.m.,’ or any of the countless frustrations that come with interacting with computers, has experienced the problem of ‘brittleness’ that plagues automated systems,” Mr. Scharre writes.



6 comments on “Report Cites Dangers of Autonomous Weapons”

  • 1
    Stardusty Psyche says:

    ‘brittleness’ that plagues automated systems,” Mr. Scharre writes.

    I totally agree. We are nowhere near any kind of reliable fully autonomous weapon.

We have long had “fire and forget” weapons, wherein the target is identified by a human being, the weapon is deployed, and it automatically guides itself to the target. The accuracy and reliability of these systems have improved continually, but they still fail and are still subject to countermeasures.

    We have remotely piloted weapons, but again there is a human being in the loop. Still, mistakes can and do happen.

    The idea of sending a fully autonomous system over the horizon on an independent search and destroy mission is, or at least should be, out of the question.




  • I dread the day when these errors are dismissed by humans saying, “that is between the drone and its god”. Thinking up ways to absolve us from war is a sin in anybody’s language.




  • 3
    NearlyNakedApe says:

    @stardustypsyche

    I totally agree. We are nowhere near any kind of reliable fully autonomous weapon.

    I concur and I also believe that we’ll never reach full reliability because of the reasons stated by Mr. Scharre. I worked in electronics for many years and I know how sensitive and finicky some systems can be. Add a wireless control link and you’re begging to be hacked. Don’t implement one and if the robotic weapon goes haywire, you have no safety net. And just like that, “reliability” goes right out the window.

    I would also argue that even if they were 100% reliable, the mere existence of such weapons is way too dangerous and riddled with far-reaching consequences to even be considered. May these technological monsters never see the light of day.




  • NearlyNakedApe #3
    Feb 29, 2016 at 9:42 pm

    I would also argue that even if they were 100% reliable, the mere existence of such weapons is way too dangerous and riddled with far-reaching consequences to even be considered. May these technological monsters never see the light of day.

    . . . . and that’s even before we look at the ineptitude of the politicians and commanders directing military operations!




  • The problem with autonomous weapons is when they prove less competent than the fully rounded decision-making required. John Carpenter, in 1974, gave us a smart-alec bomb that was, nevertheless, thoroughly Dunning–Kruger.

    Creators grant free-will at their peril.

    https://www.youtube.com/watch?v=29pPZQ77cmI

    On a further point: as with autonomous vehicles, will these devices be able to solve the trolley problem (say, the fat-man variant), saving five by sacrificing one? These potentially more moral solutions often demand a “house-trained” psychopath for their reliable execution, one free of the fast but simple-minded moral heuristics which intuitively govern us first, yet one trained in utilitarian arithmetic. Should bombs be psychopathically moral?




  • @OP- To underscore the military consequences of technological failures, the report enumerates a history of the types of failures that have occurred in military and commercial systems that are highly automated.

    Once military robots are provided with 3D printing kits to autonomously make repairs and their own spare parts, we are moving into the realm of “Terminator” sci-fi — regardless of how cute early models may look while this technology develops!

    http://3dprintingindustry.com/2015/09/10/3d-print-build-learn-to-use-your-very-own-robot/



