
Monday 13 April 2015

MIND THE GAP A new appeal against killer robots


Human Rights Watch has drawn up yet another report condemning the risks inherent in automatic weapons and in robots endowed with the capacity to kill, a technology that absolutely must be banned before it even becomes reality

Rome - The organisation Human Rights Watch (HRW) once again tackles the thorny subject of killer robots, automatic machines that will soon transform the scenarios of war, with enormous legal, moral and diplomatic implications.


In its new report, Mind the Gap, the human rights organisation worked together with experts from Harvard Law School to identify the consequences of the use of drones and killer robots - above all the lack of accountability for the victims of the new robotic weapons.


Killer robots escape the concept of personal responsibility entirely, the HRW report explains, and the laws currently in force anywhere in the world do not provide for those who designed, built or "switched on" those killing machines to pay for any resulting war crimes.

Even though they do not exist at the moment, fully automatic machines will soon be a reality and will find themselves in an utterly ambiguous legal position: without accountability, the report explains, there can be no deterrence of future crimes, no compensation for the victims, no social condemnation of the perpetrators.


Research on, funding of and development of killer robots should be banned at the international level, HRW argues, and to better promote this goal the organisation has already launched a campaign supported by more than 50 non-governmental organisations.


HRW's new report further enriches the debate on killing machines, a discussion that obviously does not lack its pragmatists - who consider autonomous defence systems an important asset from a military point of view - nor those who, taking their cue from sci-fi cinema (Humandroid), would rather think about educating the Artificial Intelligence of the future long before AI shoulders rifles and laser weapons.

Killer robots, a new appeal to the UN  Alfonso Maruccia 13 April 2015


A protest takes place outside the London offices of the defence contractor General Atomics against drones and killer robots. Photograph: Peter Marshall/Demotix/Corbis


Fully autonomous weapons, already denounced as “killer robots”, should be banned by international treaty before they can be developed, a new report urges the United Nations.


Under existing laws, computer programmers, manufacturers and military commanders would all escape liability for deaths caused by such machines, according to the study published on Thursday by Human Rights Watch and Harvard Law School.


Nor is there likely to be any clear legal framework in future that would establish the responsibility of those involved in producing or operating advanced weapons systems, say the authors of Mind the Gap: The Lack of Accountability for Killer Robots.


The report is released ahead of an international meeting on lethal autonomous weapons systems at the UN in Geneva starting on 13 April. The session will discuss additions to the convention on certain conventional weapons.


Also known as the inhumane weapons convention, the treaty has been regularly reinforced by new protocols on emerging military technology. Blinding laser weapons were pre-emptively outlawed in 1995 and combatant nations since 2006 have been required to remove unexploded cluster bombs.


Military deployment of the current generation of drones is defended by the Ministry of Defence and other governments on the grounds that there is always a man or woman “in the loop”, ultimately deciding whether or not to trigger a missile.


Rapid technical progress towards the next stage of automation, in which weapons may select their own targets, has alarmed scientists and human rights campaigners.


“Fully autonomous weapons do not yet exist,” the report acknowledges. “But technology is moving in their direction, and precursors are already in use or development. For example, many countries use weapons defence systems – such as the Israeli Iron Dome and the US Phalanx and C-RAM – that are programmed to respond automatically to threats from incoming munitions.


“Prototypes exist for planes that could autonomously fly on intercontinental missions [the UK’s Taranis] or take off and land on an aircraft carrier [the US’s X-47B].


“The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position. On the one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force.


“They would thus challenge longstanding notions of the role of arms in armed conflict, and for some legal analyses, they would be more akin to a human soldier than to an inanimate weapon. On the other hand, fully autonomous weapons would fall far short of being human.”


The report calls for a prohibition “on the development, production and use of fully autonomous weapons through an international legally binding” agreement, and urges states to adopt similar domestic laws.

The hurdles to accountability for the production and use of fully autonomous weapons under current law are monumental, the report states. “Weapons could not be held accountable for their conduct because they could not act with criminal intent, would fall outside the jurisdiction of international tribunals and could not be punished.

“Criminal liability would likely apply only in situations where humans specifically intended to use the robots to violate the law. In the United States at least, civil liability would be virtually impossible due to the immunity granted by law to the military and its contractors and the evidentiary obstacles to products liability suits.” 

Bonnie Docherty, HRW’s senior arms division researcher and the report’s lead author, said: “No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party. The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons.”
Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, which is backed by more than 50 NGOs and calls for a preemptive ban on the development, production and use of fully autonomous weapons.

UN urged to ban 'killer robots' before they can be developed Owen Bowcott 9 April 2015

Stop the Robots The anti-machine protest 19 MARCH 2015

The Laws of Roboethics 9 MARCH 2015

ROBOT WAR 27 FEBRUARY 2015



Sociopathic Robots 22 JANUARY 2015

DRONES APOCALYPSE 28 DECEMBER 2014


STOP KILLER ROBOTS 6 MAY 2013

OUT OF CONTROL 28 FEBRUARY 2008
