Killer Robots in Wartime: Could They Be More Deadly Than Humans?

The ever-greater use by the United States – both the military and the CIA – of drone air strikes in Yemen, Somalia, Syria-Iraq, Afghanistan, Pakistan and elsewhere has focused enough attention that formal international consideration is now beginning on the use of “killer robots” and the possibility of autonomous artificial intelligence weapons. US drone strikes have become so routine and bureaucratic that they provoke only small protests within the USA, usually on the part of human rights groups protesting the killing of individuals without trial. (1)

The possible wider use of drones and other robotic military materiel, such as autonomous tanks and trucks, is being worked on by governments, often in cooperation with private industry.  The possibility of an “arms race” among technologically advanced states such as the USA, Russia, China, South Korea and Israel is real, especially at a time when mutual trust is in short supply.  Beyond this widespread lack of trust, there are at least three flash points where tensions are such that an arms race in artificial intelligence weapons could lead to more generalized conflict: Korea, where China and South Korea have the technological capacity and North Korea is unpredictable; the Syria-Iraq-ISIS-Kurds-Turkey conflicts, where Israel has the technological capacity; and Ukraine, where both Russia and the USA have technological capacities.

Contemporary armament dynamics tend to acquire a momentum of their own and to resist social control. Essentially, an arms race can become a race in technology. The rhythm of technological advances far outstrips the pace of arms control negotiations. New weapons reaching the production line make arms control agreements on older weapons systems obsolete.

For the moment, those concerned with arms control issues put the emphasis on human control of drones rather than a complete ban. There should always be a human who “pulls the trigger,” even if that person is far away. Robots should not be completely autonomous. There is a science-fiction fear that robots might be even more deadly, and have less conscience, than humans. It is better not to test the hypothesis.

The most structured avenue of action for those of us concerned with the issue is the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects. There is a yearly but only two-day review of the Convention; this year it is on 22-24 November 2017.  A meeting of government experts will be held in August to prepare the November session.  Thus letters of concern on “killer robots” should be sent to Foreign Ministers before August.  There are 107 States which have ratified the Convention, any one of which could take a leading role – hence the importance of contacting as many States as possible, most appropriately through their missions to the United Nations.  The negotiating conference which led to the Convention was chaired by Ambassador Oluyemi Adeniji of Nigeria.

The Convention, which was opened for signature in 1981, owes much to US actions during the war in Vietnam.  The weapons whose prohibition or restriction was most discussed during the negotiations on the Convention in 1978 and 1979, largely at the urging of non-governmental representatives, were napalm and flechettes – an ancestor of the cluster munitions later widely used. Flechettes were small metal arrows that a bomb would scatter in all directions. They were so small that they could hardly be detected by an X-ray machine, in the rare cases when one was anywhere near the fighting in Vietnam.

I was among the NGO representatives pushing the flechette issue. Our statement to the conference stressed that “We submit further that the development of the flechette is a particularly flagrant example of the abuse, for destructive purposes, of technology which should rather be directed to meeting the legitimate peaceful needs of mankind.” Although I hate to repeat myself, I can say the same thing today concerning “killer robots.”

Through NGO efforts, strongly supported by the Government of Sweden, we were able to get a Protocol to the Convention concerning fragments non-detectable by X-ray.  Our effort had begun in 1973, at the time of a working group of the International Committee of the Red Cross dealing with incendiary and fragmentation weapons.

Although the United Nations has not yet been able to prevent armed conflicts, either between states or in civil wars, there needs to be an effort to reduce the suffering that such conflicts cause. Today, as NGOs, we work for a coming together of concerns for international humanitarian law, human rights and respect for the dignity of each person, and arms control.

It was a long road from 1973 to 1981, when the Convention was opened for signature, and the road continues today, when the spirit of the Convention needs to be upheld and applied. There was a need then, as there is now, to remain in close contact with Government representatives, the International Committee of the Red Cross, and the NGOs on the front lines.  We face the same challenge today in regulating lethal artificial intelligence weapons.  While we know that humans can be destructive, we work in the knowledge that we can limit human destructiveness and turn real intelligence toward the common good.

Rene Wadlow is President of the Association of World Citizens and a representative to the UN, Geneva.

Note

(1) For a useful NGO analysis, see “Killer Robots and the Concept of Meaningful Human Control,” Human Rights Watch, April 2016.