Time for an international convention on robotic weapons

The estimable Professor Noel Sharkey is calling today for a debate on the use of robotic weapon systems, like the UAVs that I have been covering sporadically. He’s right of course, but we need to go much further and much faster. With increasing numbers of countries turning to remote-controlled weapon systems, and the potential deployment of more independent or even ‘intelligent’ robots in war, what we need is an international convention which will limit the development and use of automated weapon systems, with clear guidelines on what lines are not to be crossed. In the case of atomic, biological and chemical weapons these kinds of conventions have had mixed success, but we have had very few clear examples of the use of such weapons in international conflicts.

India invests in surveillance drones

According to The Times of India, the Indian military is investing massively in the boom military industry of the moment – Unmanned Aerial Vehicles (UAVs or drones).

An IAI Heron TP UAV in flight

The initial order is apparently for coastal protection and involves the purchase of Heron UAVs from Israel Aerospace Industries, a specialist in such technologies which produces everything from large payload drones to tiny micro-UAVs like the Mosquito, which can be launched by hand and is designed for “providing real-time imagery data in restricted urban areas.” The Indian Defence Research and Development Organisation (DRDO) and Aeronautical Development Establishment (ADE) have also been developing their own drones in conjunction with IAI, the latest being the Rustom MALE.

A Predator UAV equipped with a Hellfire missile (USAF)

Herons are supposedly unarmed, but armed versions were used in the 2006 invasion of Lebanon by Israeli armed forces. The ToI article also makes it clear that Indian forces will be buying more overtly aggressive drones, such as the US Predator systems that have been used to such devastating effect against Al-Qaeda and the Taliban in the Pakistan-Afghanistan frontier regions. Far from easing up on the use of these remote-control killing machines, Obama’s administration has accelerated their use. They put fewer US troops in the firing line and can attack remote areas, where it is also very difficult to get an accurate independent view of their activities. However, they are alleged to have been massively inaccurate: the Pakistan government claims that only 10 of 60 missions between January 2006 and April 2009 hit their targets, killing 14 Al-Qaeda leaders and 687 civilians, an appalling ratio.

With the advent of strategic bombing and then the ICBM, the Twentieth Century saw a massive increase in the role of remote surveillance in warfare, which was intimately linked to the growth in destructive power and the ability not to confront the consequences in any direct or emotional way. Even with the tank and artillery, ground warfare was not so remote, but now in the Twenty-first Century we are seeing surveillance-based, remote-control warfare becoming increasingly normalised. It is not surprising to see hypocritical states like the USA and Israel intimately involved in the promotion of this form of conflict, which looks cleaner and more ‘moral’ from the point of view of the user, but which in fact simply further isolates them from the consequences of their actions. Real-time surveillance turns everyday life into a simulation, and drone-based warfare makes war into something like a game. And it’s a deadly and amoral game that increasing numbers of states, like India, are now playing.

Datawars Conference

There will be a very interesting-looking conference in Amsterdam, 11-12 June, called Datawars: Fighting Terrorism through Data. According to the call for papers, the workshop, to be held at the University of Amsterdam, will explore the ethical and political implications of the new data-led approach to security, risk and fighting terrorism in Europe. Suggested topics include:

  • Privacy, security and human rights
  • Ethics, responsibility and justice in European data wars
  • Risk, prevention, preemption
  • Data and surveillance
  • Private authorities, states and the European Union
  • Constituting Europe through data

It’s part of a project run by a couple of excellent researchers, Louise Amoore and Marieke de Goede, of the Universities of Durham and Amsterdam respectively (who probably don’t remember, but I worked in a tiny attic office opposite them in the Politics Dept at Newcastle for a few months just after my PhD!). I might go, as I have been doing some work on attempts to create global databases, called ‘From Echelon to Server in the Sky’, but the timing might be awkward (unfortunately I can’t reveal why yet…).

The robots are coming and now they’re angry…

A mindless drone robot is one thing but an independent robot with a tiny mind capable only of death and destruction – that is something else entirely.

Whilst I was doing my PhD in the late 90s, I met a guy called Steve Wright who used to run the Omega Foundation (who were like the ‘Lone Gunman’ organisation from the X-Files, but for real), and who is now at the Praxis Centre at Manchester Metropolitan University, UK. He was investigating the development of new forms of automated killing and control systems and ever since then, I’ve been keeping an eye on the development of remote-controlled and increasingly automated surveillance technologies, and in particular the development of robotic devices that are able not only to collect or transfer data, but to respond physically.

Two stories this week reflect the variety of developments in all kinds of different arenas, and raise all sorts of issues around the distancing of human responsibility from material, in many cases, punitive or lethal action.

'Intruder' captured by web-slinging robot
Japanese security robot (AP)

The first was the news that Japanese technologists have produced a mobile remote-controlled robot that can fire a net over ‘intruders’. Until recent years, such developments had been carried out largely in the area of military research by organisations like the RAND Corporation in the USA. However, particularly since the end of the Cold War, when military supply companies started to diversify and find new markets in a more uncertain time when the ‘military-industrial complex’ might no longer ensure their profits, there has been a gradual ‘securitization’ of civil life. One consequence of this is that so-called ‘less-lethal’ weapons are increasingly regarded as normal for use by law enforcement, private security organisations and even individuals.

A further change comes when these operations are automated: an intermediary technology using sensors of some kind is placed between the person operating them and the person(s) or thing(s) being monitored. This removes the operator from the consequences of their actions and allows them to place the moral burden of action onto the machine. The operation is aided still more if the machine itself can be ‘humanized’, in the way that Japanese robots so often are. But a kawaii (cute) weaponized robot is still a weaponized robot.

A 'Predator' Drone (USAF)

In the skies above Afghanistan, Iraq and Gaza, however, ‘cuteness’ doesn’t matter. Remote-control military machines have been stealthily entering the front lines, colonising the vertical battlespace, with lethal consequences that have not yet been adequately considered. This week we saw US unmanned aircraft operated by the CIA kill a total of 21 people in Pakistan, one of the few aspects of Bush-era policy that new President Obama has not (yet) promised to change.

All of these machines are still under some kind of control from human operators, but several profoundly misguided scientists are trying to create systems that are more independent, even ‘intelligent’. This week, I read about Professor Mandyam Srinivasan of Queensland University in Australia who, at least according to a report in The Australian, thinks it is a great idea to give missiles brains like angry bees. A mindless drone robot is one thing but an independent robot with a tiny mind capable only of death and destruction – that is something else entirely. I can think of few things that are less in the spirit of social progress than this, but he’s hardly the only one thinking this way: there are billions of dollars being pumped into this kind of research around the world…