
Are UK universities being supported by the defence sector to develop killer robots?

On 20 September this year, the UK Campaign to Stop Killer Robots, a network of UK-based organisations working towards the establishment of international legal boundaries on the use and development of autonomous weapons systems (AWS), published a report entitled “An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems.”

It found “at least 65 recent and ongoing projects within the realms of sensor technology, AI, robotics, mathematical modelling and human-machine pairing” within 13 UK-based academic institutions monitored during the study.[1]

Each of these universities specialises in the relevant technical fields, receives significant funding from the Ministry of Defence (MoD) and maintains established ties with the military or arms manufacturers.[2]

Autonomous Weapons Systems (AWS)

“Autonomous weapons systems (AWS) are the range of weapons systems that detect and apply force to a target based on sensor inputs. The specific object that is struck by an autonomous weapons system, and the time and place of this, are determined by sensor processing: following activation, there is a period of time where such systems can apply force to a target without direct human approval.”[3]

“An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems”

Omitting human control and judgement from the use of weapons increases the risks associated with conflict escalation, desensitises people to the use of force and perpetuates existing algorithmic biases which disproportionately impact marginalised communities.[4][5]

Beyond these risks, further autonomisation of weapons systems “[has] implications […] [that] negatively shape our relationship with autonomy in decision making across all areas of society.”[6] These findings highlight the importance of building robust ethical and legal guidelines for funding and conducting research on AWS-relevant projects and technologies.

“Tech like facial recognition favours light-skinned and outwardly masculine faces over darker-skinned and outwardly feminine faces. And, while efforts will be made to diversify data sets, this is not just a case of unrepresentative data. A.I. technologies are reinforcing existing institutional patterns of discrimination. Stereotypes are entrenched by automated decision-making.”[7]

StopKillerRobots.org

Projects’ Ranking System

Each project linked to the academic institutions investigated in this study was assigned a rating under a red-amber-green-grey system “which assesses the risk that findings from the project may in the future contribute as a ‘building block’ to the development of [AWS].”[8]

The following table lists the criteria for each category:[9]

Higher: The purpose of the research project is to contribute to i) the development of weaponry, ii) targeting capabilities as part of the ISTAR [intelligence, surveillance, target acquisition, and reconnaissance] function, or iii) militarily applicable tactics such as swarming. The project has the potential to increase the degree of automation in weapons systems. The project is funded through an agency of the Ministry of Defence or by a contractor with a track record of developing military autonomous technology. Example: A project to improve sensor performance funded by a multinational defence contractor.

Medium: Research with dual-use potential into robotics, computer sciences, sensor technology or a similar field of research which underpins the development of autonomous technologies or might contribute to legitimising the future use of autonomous weapons systems in warfare. There is potential that the research findings may contribute to the future development of autonomous weapons systems in the absence of any control measures to prevent this. Example: A project to develop software for use in the development of self-driving vehicles.

Lower: No immediately apparent military or dual-use applications in the development of autonomous technologies. Example: A project to develop a regulatory framework for the future application of autonomous technologies.

Insufficient information: Insufficient information is available in the public domain to allow the project to be assigned to one of the above categories. Example: A project for which only a one-sentence description is available in the public domain, with no further supporting information or web-link to a full project description.

The 13 academic institutions studied were: The Alan Turing Institute, Cranfield University, Imperial College London, University College London, University of Birmingham, University of Bristol, University of Cambridge, University of Edinburgh, University of Manchester, University of Oxford, University of Southampton, University of Strathclyde and the University of Warwick.

Of the 65 relevant projects identified by the study, 17 were found to pose a “higher risk of use in the development of [AWS].”[10]

The lack of publicly available information regarding project details and funding is illustrated by one example highlighted in the report: the “Flexible Autonomy for Swarm Robotics” project at the Alan Turing Institute.

The project’s start date, end date, funding details and research affiliations are all unknown. Such opacity is inappropriate given the significance of the project’s aims in the field of AWS-relevant research, which are to “ensure that human operators […] understand the automated actions taken by the swarms” and to “enable deployments of up to 100 robots with minimal human oversight.”[11]

The Academia-Industry-Government Nexus

Although the UK government maintains that “it does not possess fully [AWS] and has no intentions of developing them,” and that “when deploying [AWS], it will always ensure meaningful […] human involvement [throughout the process],” the MoD continues to fund and support research in computer science, robotics and sensors, the “three […] key disciplines underpinning [AWS] technology.”[12] The close relationship fostered between publicly-funded UK universities and the defence sector should raise concerns about the weakness of ethics frameworks that overlook academic research and the absence of clear-cut policies to regulate governmental involvement in such affairs. Across the targeted institutions, the investigation found “no reference to AWS or military research in […] ethics policy, at neither the university nor the department level.”[13]

Within the context of military application, oversights regarding ethical permissibility have the potential to yield disastrous humanitarian consequences given the “credible risk” that such research may be used for the development and deployment of AWS.[14] Furthermore, the lack of transparency with regard to “exactly what money goes where” constricts the public’s ability to hold accountable the three-pronged military-research-funding nexus comprised of academia, industry, and government.[15]

In some of the institutions investigated, technology that could advance the development of AWS is produced via spinout companies created by university academics, which directly develop AWS-relevant technologies in partnership with defence bodies.[16] This activity is also led by students within the selected institutions, and the ethics modules underpinning such operations are loosely defined or not strictly enforced.

“An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems”
A robot mascot for the Campaign to Stop Killer Robots in London, April 2013. (Photo credit: Carl Court/AFP/Getty Images)

In 2020, the UK government announced “a substantial increase in military spending” following the publication of the Integrated Review, which identified research and development (R&D) and science and technology (S&T) as sectors of significant interest for future investment. With the R&D budget subsequently doubled, the “post-Brexit” conception of the UK seemingly aims to strengthen the hold of the established nexus and further undercuts non-military representation in higher education institutions.[17]

“Over-representation of military interests in the university sector is problematic because it distorts focus away from humanitarian aims and facilitates the use of force in addressing global conflicts.”[18]

“An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems”

Integrating the military industry within public institutions offers an effective way for arms producers to improve their public image amid growing awareness of the indiscriminate harm and human rights violations resulting from UK arms exports. At the same time, the rising presence of military actors and interests in universities makes these sites increasingly militarised spaces, further endangering the notion of higher education as “a public good and an autonomous sphere for the development of a critical and productive citizenry.”[19]

“An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems”

Lack of Transparency: Specific Concerns

The study’s authors explicitly state the following key concerns regarding the transparency of AWS-related research:[20]

  • The lack of information made publicly available by institutions regarding funding received through defence partnerships;
  • The lack of information surrounding ethical decision-making arrangements on research funding at the universities’ highest ethical boards;
  • The manner in which the universities engaged with information requests in the scope of the investigation and campaigning activities: no freedom of information (FOI) request issued by the investigation was answered in full, and the responses received indicate a general unwillingness to effectively address the need for safeguarding policies.

The target institutions displayed varying levels of transparency in publishing “titles, dates and descriptions of projects,” as well as discrepancies in sharing funding-related information from different sources. For example, projects financed by the Engineering and Physical Sciences Research Council (EPSRC) and other governmental and non-governmental innovation-based organisations disclosed their funding sources more fully, while projects funded by the MoD (“through programs such as the Defence and Security Accelerator [DASA]”) were published with far less detail.[21]

The UK Campaign to Stop Killer Robots Encourages All UK Universities to:

  • Sign the Future of Life Pledge calling for strong international norms, regulations and laws against lethal autonomous weapons;
  • Make a stand-alone pledge to establish mechanisms to minimise the risks that university dual-use research could be applied for unintended malicious uses or incorporated in harmful weapons systems, such as autonomous weapons systems. Promote the pledge and raise awareness of it amongst university students, researchers, staff, and industrial partners;
  • Incorporate a specific policy to assess the risks of dual-use research using AI and autonomous technology into a university-wide ethical framework (and within the framework of relevant faculties) to help guide ethical decision-making on research funding and activities;
  • Regularly and transparently report on any ongoing research programmes for which particular vigilance is required to ensure that university-developed technology is not applied in a harmful manner, such as in violation of International Human Rights Law and International Humanitarian Law;
  • Increase the transparency of universities’ funding sources and research projects, particularly those in controversial areas such as partnerships with defence bodies, for instance by creating a public, user-friendly and regularly updated database;
  • Provide expertise to – and engage with – relevant stakeholders to facilitate the development of national and international legislation to restrict and regulate the development and use of autonomous weapons systems. Expertise would be particularly valuable with regard to developing legal provisions to prevent dual-use technology built for peaceful purposes being incorporated into devices with harmful applications. The UK Campaign to Stop Killer Robots advocates for an international legal treaty that firstly, prohibits the development and use of autonomous weapons that cannot be meaningfully controlled and those that target humans, and, secondly, regulates other autonomous weapons to ensure meaningful human control over the use of force.[22]

Sources

[1] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 10)

[2] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 7)

[3] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 8)

[4] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 5)

[5] Stop Killer Robots: A Guide for Policy Makers (Page 4)

[6] Stop Killer Robots: A Guide for Policy Makers (Page 6)

[7] StopKillerRobots.org (Front page)

[8] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 8)

[9] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 9)

[10] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 10)

[11] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 29)

[12] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 6)

[13] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 11)

[14] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 4)

[15] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 14)

[16] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 10)

[17] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 12)

[18] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 12)

[19] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 13)

[20] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 11)

[21] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 19)

[22] Griffiths et al. An Investigation Into the Role of UK Universities in the Development of Autonomous Weapons Systems (Page 5)