At a recent event hosted by the All-Party Parliamentary Groups on the UN and Drones, Ben Emmerson, UN Special Rapporteur on human rights and counter-terrorism, once again warned of the inherent risks in the use of armed drones and the rapid development of autonomous weapons systems.
The meeting with UN Special Rapporteur Ben Emmerson and Professor Michael Clarke, Director-General of RUSI, focused on Emmerson’s ongoing inquiry into the use of armed drones and their impact on civilians. Emmerson voiced particular concern over the lack of accountability for CIA strikes in Pakistan’s Federally Administered Tribal Areas.
Discussing the contrasting practices of the UK and US governments, both agreed that while UK drones are employed only in the Afghan war, intelligence gathered by UK drones is likely used to support US strikes – even outside the remit of armed conflict.
Clarke said: “We share information and it’s very hard to say that it is not used to target individuals. There’s a reasonable presumption that sharing information makes us complicit in the US policy…the UK silence [on the UK’s role in the US drone programme] is deafening.”
The debate also highlighted the implications of greater autonomy enabled by advances in technology. Emmerson and Clarke stressed that further international regulation may be required to ensure drones comply with the laws of armed conflict.
Emmerson warned of the prospective threat of delegating kill decisions to machines, a risk inherent in the development of autonomous weapons systems. He said:
We are sitting on the cusp of the next generation, and the next generation of RPA needs to be able to defend itself against attack which means it needs to be armed with the technology to make a decision whether to respond to an attack. Because of the inevitable time lag between Creech Air Force Base in Nevada or RAF Waddington in the UK and the field of conflict – it’s only about a second or two but it’s too long for the sort of decisions that need to be made in self-defence in a moment of agony in conflict – it follows that some of those decisions are going to need to be delegated to the machine itself.
There doesn’t seem to be any serious suggestions from any state that it is even contemplating allowing a machine to make a decision to kill a human being, but the distinctions are quite difficult to draw once an RPA is defending itself against attack because if it’s making the decision for itself, then obviously there are risks that are associated with that.
In a report prepared earlier this year for the UN, Emmerson criticised practices in the use of armed drones, the international secrecy surrounding their use and the “accountability vacuum” in which drones operate.
An audio recording of the full event is available here.