
Sunday, December 28, 2014

Drones Apocalypse

“The drone is the ultimate imperial weapon, allowing a superpower almost unlimited reach while keeping its own soldiers far from battle,” writes New York Times reporter James Risen in his book “Pay Any Price: Greed, Power, and Endless War.”


Drones come in every shape and size, but all of them have limitations (low speed, poor maneuverability in tight spaces, and so on) that prevent their use in certain situations. DARPA has unveiled the Fast Lightweight Autonomy (FLA) program to build small UAVs capable of autonomously searching inside buildings or other environments, with no remote commands. The agency drew its inspiration from birds of prey, which can fly at high speed among the trees of a forest, like the goshawk shown in the video.

Soldiers carrying out reconnaissance in dangerous areas during their missions, and rescue teams responding to natural disasters such as earthquakes and floods, use drones to spot from above the hazards and situations that cannot be seen from the ground. Often, however, they have to enter buildings to search for survivors, and doing so puts their own lives at risk. For this reason DARPA has launched the Fast Lightweight Autonomy program, whose goal is to create a new class of algorithms that let drones navigate a maze of rooms, stairways and corridors without a remote pilot.

The U.S. government agency wants to design small UAVs that can enter buildings through open windows and fly at a speed of 20 meters per second (72 km/h), even in the absence of communication with their sensors and without relying on GPS waypoints.

DARPA wants to build these drones by taking its cue from nature. The UAVs will have to show capabilities similar to those of birds of prey, like the goshawk in the video, which can fly through a dense forest without crashing into the trees. If the program succeeds, the algorithms will be applied to existing UAV systems.

DARPA wants to create drones as agile as birds of prey, Luca Colantuoni
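To make the FLA goal concrete, here is a minimal sketch of the kind of reactive, GPS-free steering loop such a drone might run: it reads a ring of range sensors, picks the clearest bearing, and scales its speed toward the 20 m/s target as the path opens up. This is illustrative Python only; the sensor layout, thresholds and function names are assumptions for the sake of the example, not DARPA's actual FLA algorithms.

import math

MAX_SPEED = 20.0    # m/s, the FLA target speed (72 km/h)
SAFE_RANGE = 10.0   # m, clearance at which full speed is allowed (assumed value)
MIN_RANGE = 1.0     # m, below this clearance the drone hovers and turns (assumed value)

def pick_heading(ranges):
    """ranges: list of (bearing_rad, distance_m) pairs from onboard sensors.
    Returns (heading_rad, speed_m_s), favouring the most open bearing."""
    # Score each bearing by how far it is clear, with a small penalty for
    # turning, so the vehicle prefers to keep flying roughly straight ahead.
    best_bearing, best_score = 0.0, float("-inf")
    for bearing, dist in ranges:
        score = dist - 0.5 * abs(bearing)
        if score > best_score:
            best_bearing, best_score = bearing, score
    # Speed scales with the clearance along the chosen bearing.
    clearance = max(d for b, d in ranges if abs(b - best_bearing) < 0.2)
    if clearance <= MIN_RANGE:
        return best_bearing, 0.0   # too tight: hover and rotate toward the opening
    speed = MAX_SPEED * min(1.0, (clearance - MIN_RANGE) / (SAFE_RANGE - MIN_RANGE))
    return best_bearing, speed

if __name__ == "__main__":
    # Fake a doorway slightly to the left: open at -0.3 rad, walls elsewhere.
    readings = [(-0.6, 2.0), (-0.3, 12.0), (0.0, 3.0), (0.3, 2.5), (0.6, 1.5)]
    heading, speed = pick_heading(readings)
    print("steer %.0f deg at %.1f m/s" % (math.degrees(heading), speed))

Run once per control tick, a loop like this needs no GPS and no operator link, which is the constraint the program sets; the real FLA work presumably layers onboard mapping and state estimation on top of something far more capable than this sketch.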

Iran has tested suicide drones. The tests took place during the military exercises named ‘Mohammad Rasoulallah’, which began in the Strait of Hormuz on December 25.

The tests were announced by Brigadier General Ahmad Reza Pourdastan. These are pilotless aircraft that can be used to attack targets both in the air and on the ground.

The ayatollahs’ regime has long attached enormous importance to drone research and testing, regarding them as effective weapons. Numerous tests were carried out during 2014. The tests of recent months show that Iran has achieved major successes in its defense sector and has reached self-sufficiency in the production of essential military equipment and systems.

Drones provide remote-control combat, custom-designed for wars of choice, and they have become the signature weapons of the war on terror.
But America’s monopoly on death from a distance is coming to an end. Drone technology is relatively simple and cheap to acquire — which is why more than 70 countries, plus nonstate actors like Hezbollah, have combat drones.
The National Journal’s Kristin Roberts imagines how drones could soon “destabilize entire regions and potentially upset geopolitical order”: “Iran, with the approval of Damascus, carries out a lethal strike on anti-Syrian forces inside Syria; Russia picks off militants tampering with oil and gas lines in Ukraine or Georgia; Turkey arms a U.S.-provided Predator to kill Kurdish militants in northern Iraq who it believes are planning attacks along the border.
Label the targets as terrorists, and in each case, Tehran, Moscow and Ankara may point toward Washington and say, we learned it by watching you. In Pakistan, Yemen and Afghanistan.”
Next: SkyNet.
SkyNet, you recall from the Terminator movies, is a computerized defense network whose artificial intelligence programming leads it to self-awareness. People try to turn it off; SkyNet interprets this as an attack — on itself. Automated genocide follows in an instant.
In an article you should read carefully because/despite the fact that it will totally freak you out, The New York Times reports that “arms makers … are developing weapons that rely on artificial intelligence, not human instruction, to decide what to target and whom to kill.”
More from the Times piece:
“Britain, Israel and Norway are already deploying missiles and drones that carry out attacks against enemy radar, tanks or ships without direct human control. After launch, so-called autonomous weapons rely on artificial intelligence and sensors to select targets and to initiate an attack.
“Britain’s ‘fire and forget’ Brimstone missiles, for example, can distinguish among tanks and cars and buses without human assistance, and can hunt targets in a predesignated region without oversight. The Brimstones also communicate with one another, sharing their targets. …
“Israel’s antiradar missile, the Harpy, loiters in the sky until an enemy radar is turned on. It then attacks and destroys the radar installation on its own.
“Norway plans to equip its fleet of advanced jet fighters with the Joint Strike Missile, which can hunt, recognize and detect a target without human intervention.”
“An autonomous weapons arms race is already taking place,” says Steve Omohundro, a physicist and AI specialist at Self-Aware Systems. “They respond faster, more efficiently and less predictably.”
As usual, the United States is leading the way toward dystopian apocalypse, setting precedents for the use of sophisticated, novel, more efficient killing machines. We developed and dropped the first nuclear bombs. We unleashed the drones. Now we’re at the forefront of AI missile systems.
The first test was a disaster: “Back in 1988, the Navy test-fired a Harpoon antiship missile that employed an early form of self-guidance. The missile mistook an Indian freighter that had strayed onto the test range for its target. The Harpoon, which did not have a warhead, hit the bridge of the freighter, killing a crew member.”
But we’re America! We didn’t let that slow us down: “Despite the accident, the Harpoon became a mainstay of naval armaments and remains in wide use.”
U-S-A! U-S-A!
I can see you tech geeks out there, shaking your heads over your screen, saying to yourselves: “Rall is paranoid! This is new technology. It’s bound to improve. AI drones will become more accurate.”
Not necessarily.
Combat drones have hovered over towns and villages in Afghanistan and Pakistan for the last 13 years, killing thousands of people.
The accuracy rate is less than impressive: 3.5 percent. By the military’s own assessment, 96.5 percent of the victims are innocent civilians.
The Pentagon argues that its new generation of self-guided hunter-killers is merely “semi-autonomous” and so doesn’t run afoul of a U.S. rule against such weapons. But only the initial launch is initiated by a human being.
“It will be operating autonomously when it searches for the enemy fleet,” Mark Gubrud, a physicist who is a member of the International Committee for Robot Arms Control, told the Times. “This is pretty sophisticated stuff that I would call artificial intelligence outside human control.”
If that doesn’t worry you, this should: It’s only a matter of time before other countries, some of which don’t like us, get these too.
Not much time.

‘Game of drones’ draws us toward apocalypse, 2014/12/07



