Source: The New Internationalist
Irrespective of intelligence, no machine is capable of morality. So, if you thought drones were bad, you are likely to take an even dimmer view of their successors: ‘lethal autonomous robotic weapons’. Unlike remote-controlled drones, ‘killer robots’ require no external ‘live’ human input at all, and can be pre-programmed to select and destroy specific targets.
The new weaponry poses a grave threat to human rights, according to the Campaign to Stop Killer Robots, which argues that the arms undermine international law by eliminating all human culpability.
Like militarized drones before them, developed by contractors behind closed doors and unleashed on to the battlefield almost without warning, lethal autonomous weapons could be put into combat use without further public debate.
It’s a trend that concerns human rights groups and military organizations in equal measure. The latter consider it an affront to military practice to suggest that wars should not be fought by trained individuals acting under established codes.
The ‘killer robot’ technology is being developed in the US, Britain, Russia, China and Israel. Israel already has the ‘Harpy’ – a ‘fire and forget’ weapon capable of detecting and destroying radar emitters.
‘If this is coupled with greater autonomy of movement and operation,’ explains Laura Boillot of Article 36, a not-for-profit working to prevent unacceptable harm caused by weapons, ‘we will start to see fully autonomous weapons in combat.’
So when will governments discuss putting controls on fully autonomous weapons?
The UN Human Rights Council hosted its first debate on the ethics of these weapons last May. Britain opposed a moratorium on development of the arms – the only state out of 24 in attendance to do so.
‘A couple of states have recommended that this issue be discussed at the next meeting of the Convention on Certain Conventional Weapons (CCW) in November,’ says Boillot. ‘Around 100 states are party to the treaty, which managed to ban blinding lasers, comparable to killer robots in that they were banned before coming into use. But on the whole, it is not famous for ambitious, standard-setting results.’
Even if fully autonomous weapons are blocked by the CCW, the technology now exists. In the long run, it will become increasingly difficult to govern.
Source URL: http://newint.org/sections/agenda/2013/10/01/killer-robots/