Member States to the Convention on Certain Conventional Weapons will decide tomorrow whether to consider autonomous weapons systems, or killer robots, under the current treaty.
Ahead of what could be a landmark decision towards a moratorium on the development of these weapons, Professor Noel Sharkey, chair of ICRAC and founding member of the Campaign to Stop Killer Robots, discussed autonomous weapons systems on BBC Radio 4’s Today Programme.
Today: Can you start with a little bit of a lesson on what that [referring to the Northrop Grumman X-47B unmanned combat air vehicle] is and why it is different from a drone?
Sharkey: That’s quite different from a drone because a drone is remotely controlled from a few thousand miles away. It can be controlled from anywhere, really. There are two people sitting at a big computer screen with joysticks, like a video game, moving it around and looking through camera eyes.
In the case of the X-47B, there is no controller. There is no pilot. It flies on its own under programmed guidance and instructions from a computer.
Today: But isn’t that programming effectively the same as having somebody in control, it’s just that they’ve done it in advance?
Sharkey: Well it’s not so flexible. Humans are watching what’s going on. In terms of the flying of the machine, for me as a roboticist, that is one incredible achievement. It’s a brilliant piece of technology.
The Campaign to Stop Killer Robots, of which ICRAC is a founding member, is not against the idea of autonomous robots. My vacuum cleaner is autonomous, for instance, and I don’t want to get rid of that. I’ve been working on autonomous robots for thirty years.
What we are interested in getting rid of is the kill function, the ability of a robot weapon, an autonomous weapon, that can be on the ground – they’re developing them for the ground, they’re developing them for the sea, they’re developing them for submarines.
What we don’t want is for the machine to be delegated with the decision to kill someone. In other words, the machine finds the target itself and attacks it without human involvement, without human intervention.
Today: Because your concern is that there is nobody at some point who can say that that is wrong, that at a particular point a mistake is about to be made?
Sharkey: Yes. It is mishaps. I mean, no machine can discriminate between a civilian target and a non-civilian target for instance.
Automatic target recognition, which missiles use, can use some profiling to see a tank or recognise a ship, provided they are in an uncluttered environment. For instance, a truck among trees with sticking out branches could easily be mistaken for a tank.
The other thing that it uses is heat signatures so that it can detect moving bodies. This couldn’t tell if that thing was a child or an insurgent or a soldier, really.