{"id":8844,"date":"2017-08-21T14:45:47","date_gmt":"2017-08-21T22:45:47","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2017\/08\/21\/news-2617\/"},"modified":"2017-08-21T14:45:47","modified_gmt":"2017-08-21T22:45:47","slug":"news-2617","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2017\/08\/21\/news-2617\/","title":{"rendered":"Sorry, Banning \u2018Killer Robots\u2019 Just Isn\u2019t Practical"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/599b55e8a87691355b28ecad\/master\/pass\/KillerRobots-AP_17124602633661.jpg\"\/><\/p>\n<p><strong>Credit to Author: Tom Simonite| Date: Mon, 21 Aug 2017 22:23:23 +0000<\/strong><\/p>\n<p data-reactid=\"247\"><span class=\"lede\" data-reactid=\"248\"><!-- react-text: 249 -->Late Sunday, 116 <!-- \/react-text --><\/span><!-- react-text: 250 -->entrepreneurs, including Elon Musk, released a letter to the United Nations warning of the dangerous \u201cPandora\u2019s Box\u201d presented by weapons that make their own decisions about when to kill. 
Publications including <a href=\"https:\/\/web.archive.org\/web\/20170820164709\/https:\/\/www.theguardian.com\/technology\/2017\/aug\/20\/elon-musk-killer-robots-experts-outright-ban-lethal-autonomous-weapons-war\" target=\"_blank\"><em>The Guardian<\/em><\/a> and <a href=\"https:\/\/www.washingtonpost.com\/news\/innovations\/wp\/2017\/08\/21\/elon-musk-calls-for-ban-on-killer-robots-before-weapons-of-terror-are-unleashed\/?utm_term=.c2f5db0a3162\" target=\"_blank\"><em>The Washington Post<\/em><\/a> ran headlines saying Musk and his cosigners had called for a \u201cban\u201d on \u201ckiller robots.\u201d<\/p>\n<p>Those headlines were misleading. <a href=\"https:\/\/futureoflife.org\/autonomous-weapons-open-letter-2017\" target=\"_blank\">The letter<\/a> doesn\u2019t explicitly call for a ban, although one of the organizers <a href=\"https:\/\/newsroom.unsw.edu.au\/news\/science-tech\/world%E2%80%99s-tech-leaders-urge-un-ban-killer-robots\" target=\"_blank\">has suggested<\/a> it does. Rather, it offers technical advice to a UN committee on autonomous weapons formed in December. The group\u2019s warning that autonomous machines \u201ccan be weapons of terror\u201d makes sense. But trying to ban them outright is probably a waste of time.<\/p>\n<p>That\u2019s not because it\u2019s impossible to ban weapons technologies. 
Some 192 nations have signed the Chemical Weapons Convention, for example, and an international agreement blocking the use of laser weapons intended to cause permanent blindness is holding up nicely.<\/p>\n<p>Weapons systems that make their own decisions are a very different, and much broader, category. The line between weapons controlled by humans and those that fire autonomously is blurry, and many nations\u2014including the US\u2014have begun the process of crossing it. Moreover, technologies such as robotic aircraft and ground vehicles have proved so useful that armed forces may find giving them more independence\u2014including to kill\u2014irresistible.<\/p>\n<p>A recent report on artificial intelligence and war commissioned by the Office of the Director of National Intelligence concluded that the technology is set to <a href=\"https:\/\/www.wired.com\/story\/ai-could-revolutionize-war-as-much-as-nukes\/\">massively magnify military power<\/a>. Greg Allen, coauthor of the report and now an adjunct fellow at the nonpartisan think tank the Center for a New American Security, doesn\u2019t expect the US and other countries to be able to stop themselves from building arsenals of weapons that can decide when to fire. \u201cYou are unlikely to achieve a full ban of autonomous weapons,\u201d he says. 
\u201cThe temptation for using them is going to be very intense.\u201d<\/p>\n<p>The US Department of Defense does have a policy of keeping a \u201chuman in the loop\u201d when deploying lethal force, but it hasn\u2019t suggested it would be open to an international agreement banning autonomous weapons. The Pentagon did not immediately respond to a request for comment Monday. In 2015, the UK government responded to calls for a ban on autonomous weapons by saying there was no need for one, and that existing international law was sufficient.<\/p>\n<p>You don\u2019t have to look far to find weapons already making their own decisions to some degree. One is the AEGIS ship-based missile and aircraft-defense system used by the US Navy. It is capable of engaging approaching planes or missiles without human intervention, according to a <a href=\"https:\/\/s3.amazonaws.com\/files.cnas.org\/documents\/Ethical-Autonomy-Working-Paper_021015_v02.pdf\" target=\"_blank\">CNAS report<\/a>.<\/p>\n<p>Other examples include a drone called the Harpy, developed in Israel, which patrols an area searching for radar signals. If it detects one, it automatically dive-bombs the signal\u2019s source. 
Manufacturer Israel Aerospace Industries markets the Harpy as a \u201c<a href=\"http:\/\/www.iai.co.il\/2013\/36694-16153-en\/Business_Areas_Land.aspx\" target=\"_blank\">\u2018Fire and Forget\u2019 autonomous weapon<\/a>.\u201d<\/p>\n<p>Musk signed an earlier letter in 2015, alongside thousands of AI experts in academia and industry, that called for a <a href=\"https:\/\/futureoflife.org\/open-letter-autonomous-weapons\/\" target=\"_blank\">ban on offensive use of autonomous weapons<\/a>. Like Sunday\u2019s letter, it was coordinated by the Future of Life Institute, an organization that ponders the long-term effects of AI and other technologies, and to which Musk has gifted $10 million.<\/p>\n<p>WIRED couldn\u2019t reach the institute to ask why the new letter took a different tack. But Rebecca Crootof, a researcher at Yale Law School, says people concerned about autonomous weapons systems should consider more constructive alternatives to campaigning for a total ban.<\/p>\n<p>\u201cThat time and energy would be much better spent developing regulations,\u201d she says. International laws such as the Geneva Convention that restrict the activities of human soldiers could be adapted to govern what robot soldiers can do on the battlefield, for example. 
Other regulations short of a ban could try to clear up the murky question of who is held legally accountable when a piece of software makes a bad decision, for example by killing civilians.<\/p>\n<p><a href=\"https:\/\/www.wired.com\/story\/sorry-banning-killer-robots-just-isnt-practical\" target=\"bwo\">https:\/\/www.wired.com\/story\/sorry-banning-killer-robots-just-isnt-practical<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/599b55e8a87691355b28ecad\/master\/pass\/KillerRobots-AP_17124602633661.jpg\"\/><\/p>\n<p><strong>Credit to Author: Tom Simonite | Date: Mon, 21 Aug 2017 22:23:23 +0000<\/strong><\/p>\n<p>Elon Musk and others seek restrictions on the use of autonomous weapons<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10378,10607],"tags":[1001,714],"class_list":["post-8844","post","type-post","status-publish","format-standard","hentry","category-security","category-wired","tag-business","tag-security"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/8844","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=8844"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/8844\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=8844"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=8844"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=8844"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}