
UN Warns Of Killer Robots, May Pose Threat To Peace And Should Be Banned


"Killer robots" that could attack targets autonomously without a human pulling the trigger pose a threat to international stability and should be banned before they come into existence, the United Nations will be told by its human rights investigator this week.

Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions, will address the UN Human Rights Council in Geneva on Thursday and call for a worldwide moratorium on what he calls "lethal autonomous robotics" – weapons systems that, once activated, can lock on and kill targets without further involvement of human handlers.

"Machines lack morality and mortality, and as a result should not have life and death powers over humans," Heyns will say.

Heyns's call for a moratorium draws the UN into the realms of sci-fi: fully autonomous weapons have not yet been developed, and exist only in the imaginations of military planners. However, experts in warfare technologies warn that the world's leading military powers are moving so rapidly in this direction that a pre-emptive ban is essential.


"States are working towards greater and greater autonomy in weapons, and the potential is there for such technologies to be developed in the next 10 or 20 years," said Bonnie Docherty of Harvard law school's International Human Rights Clinic, who co-authored a report on the subject with Human Rights Watch.

In his submission to the UN, Heyns points to the experience of drones. Unmanned aerial vehicles were intended initially only for surveillance, and their use for offensive purposes was prohibited, yet once strategists realised their perceived advantages as a means of carrying out targeted killings, all objections were swept out of the way.

Drone technology has already moved a step closer to a fully autonomous state in the form of the X-47B, a super-charged UAV developed by the US Navy that can fly itself, and which last week completed its first takeoff from an aircraft carrier. The drone is billed as a non-combat craft, yet its design includes two weapons bays capable of carrying more than 4,000lb.

Britain is developing its own next-generation drone, known as Taranis, which can be sent to tackle targets at long range and can defend itself against enemy aircraft. Like the X-47B, it has two built-in weapons bays, though it is currently unarmed.


Apart from drones, several states are known to be actively exploring the possibility of autonomous weapons operating on the ground. South Korea has set up sentry robots, known as SGR-1, along the Demilitarized Zone with North Korea that can detect people entering the zone through heat and motion sensors. Though the sentry is currently configured so that it has to be operated by a human, it is reported to have an automatic mode which, if deployed, would allow it to fire independently on intruders.

Steve Goose, Human Rights Watch's arms director, said it was undeniable that "modern militaries are looking to develop autonomous weapons. The question is how far that push for autonomy will go."

Given its dominance as the world's leading military power, the US is likely to set the pace. According to Human Rights Watch, the Pentagon is spending about $6bn a year on research and development of unmanned systems, though in a directive adopted last November it said that fully autonomous weapons could only be used "to apply non-lethal, non-kinetic force, such as some forms of electronic attack".

The key issue identified by Heyns in his UN submission is whether future weapons systems will be allowed to make the decision to kill autonomously, without human intervention. In military jargon, there are unmanned weapons where humans are "in the loop" – that is, they retain control over the weapon and ultimately pull the trigger – as opposed to the future potential for autonomous weapons where humans are "out of the loop" and the decision to attack is taken by the robot itself.

The possibility of "out of the loop" weapons raises a plethora of moral and legal issues, Heyns says. Most worryingly, it could lead to increasing distance between those carrying out the attack and their targets: "In addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill – and their execution."

