
The risks of autonomous weapons: An analysis centred on the rights of persons with disabilities

Published online by Cambridge University Press:  07 November 2022


Abstract

Autonomous weapons systems have been the subject of heated debate since 2010, when Philip Alston, then Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, brought the issue to the international spotlight in his interim report to the United Nations (UN) General Assembly 65th Session. Alston affirmed that “automated technologies are becoming increasingly sophisticated, and artificial intelligence reasoning and decision-making abilities are actively being researched and receive significant funding. States’ militaries and defence industry developers are working to develop ‘fully autonomous capability’, such that technological advances in artificial intelligence will enable unmanned aerial vehicles to make and execute complex decisions, including the identification of human targets and the ability to kill them.”Footnote 1 Later, in 2013, Christof Heyns, who was Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions at the time, published a report that elaborated further on the issues raised by what he called “lethal autonomous robotics”.Footnote 2 Following a recommendation by the Advisory Board on Disarmament Matters at the UN General Assembly 68th Session, discussions on autonomous weapons systems began in 2014 within the framework of the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, as amended on 21 December 2001. The Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (GGE on LAWS)Footnote 3 was then created in 2016 to focus on this issue.Footnote 4 While the Group has continued to meet since then, as of September 2022 no clear steps have been taken towards a normative framework on autonomous weapons.

In all these years, persons with disabilities – including conflict survivors – have not been included in discussions, nor has the disability perspective been reflected in international debate on autonomous weapons. Only recently has there been any effort to consider the rights of persons with disabilities when examining ethical questions related to artificial intelligence (AI). In this article, we will examine how and why autonomous weapons have a disproportionate impact on persons with disabilities, because of the discrimination that results from a combination of factors such as bias in AI, bias in the military and the police, barriers to justice and humanitarian assistance in situations of armed conflict, and the lack of consultation and participation of persons with disabilities and their representative organizations on issues related to autonomy in weapons systems.

Type: Research Article
Copyright © The Author(s), 2022. Published by Cambridge University Press on behalf of the ICRC.

Introduction

According to the International Committee of the Red Cross (ICRC),

[a]utonomous weapon systems select and apply force to targets without human intervention. After initial activation or launch by a person, an autonomous weapon system self-initiates or triggers a strike in response to information from the environment received through sensors and on the basis of a generalized ‘target profile’. This means that the user does not choose, or even know, the specific target(s) and the precise timing and/or location of the resulting application(s) of force.Footnote 5

The development and possible use of autonomous weapons has caused serious concern among various sectors of civil society (led by the Campaign to Stop Killer Robots), the ICRC, States committed to international humanitarian law (IHL) and human rights, academia, the scientific community, faith leaders, tech workers and others. These concerns result from different angles of analysis, including ethics, humanitarian perspectives, international security, technology and of course IHL and international human rights law.Footnote 6 While some efforts have been made to examine the disproportionate impact of these weapons on marginalized populationsFootnote 7 and in the global South,Footnote 8 only recently has any consideration been given to the disproportionate effect they would have on people with disabilities.Footnote 9

Indeed, in his 2021 report on the rights of persons with disabilities in armed conflict, the UN Special Rapporteur on the Rights of Persons with Disabilities, Gerard Quinn, stated that “[t]he future of warfare, which may increasingly rely on autonomous weapons systems driven by artificial intelligence and machine-learning, would seem to exponentially compound [the] difficulties” faced by persons with disabilities in situations of armed conflict.Footnote 10 Quinn clearly identified autonomous weapons systems as additional risks for persons with disabilities in situations of conflict, since they would significantly compound the difficulties that persons with disabilities already face due to the collapse of essential and support services, and to the lack of an inclusive humanitarian response. A few months later, Quinn further detailed this problem in his report on artificial intelligence, stating that

the deployment and use of fully autonomous weapons systems, like other artificial intelligence systems, raises concerns as to the ability of weaponry directed by artificial intelligence to discriminate between combatants and non-combatants, and make the nuanced determination as to whether an assistive device qualifies a person with disabilities as a threat.Footnote 11

In this article we will examine what effects autonomous weapons would have on people with disabilities, based on an analysis of the discriminatory factors and barriers that such individuals already encounter. As we shall see, the possible use of such weapons must be considered not in isolation, but in the context of the structural discrimination that exists in various sectors and contexts related to autonomous weapons.

It is important to note that although the article focuses on persons with disabilities, the text adopts an intersectionalFootnote 12 approach that looks at how different identities and characteristics can result in “multiple discrimination” as defined by the Committee on the Rights of Persons with Disabilities: “a situation where several grounds operate and interact with each other at the same time in such a way that they are inseparable and thereby expose relevant individuals to unique types of disadvantage and discrimination”.Footnote 13

When analyzing autonomous weapons and their possible impact on persons with disabilities, it is fundamental to look at the wider contexts in which these weapons are being developed and would be used. As we will see, accepting autonomous weapons as legitimate means of warfare would mean reproducing and amplifying, exponentially, the existing biases in our societies against marginalized groups – threatening the right to life and dignity – and rendering access to justice for victims even more difficult. As Acheson affirms, “autonomous weapon systems are not just material technologies. While they are that, they also need to be understood within the wider context of power and violence.”Footnote 14

To examine the possible impact of autonomous weapons on persons with disabilities, we shall start by showing how existing bias in applications of AI in the civilian sector has meant that negative effects – when they occur – have a much greater impact on historically marginalized populations than on the population in general. As a group of twenty researchers in AI and emerging technologies has pointed out, “[d]esigned in an unequal society, these systems can be used to reproduce those inequalities. Built with an emphasis on efficiency rather than dignity, they can do irreparable harm.”Footnote 15 In the case of autonomous weapons, this “harm” means nothing less than a threat to the right to life, with the resulting damage being death and injury. Autonomous weapons could have the same impact as other weapons, but as we shall see, their effects would be compounded by a disproportionate impact on people with disabilities and other historically marginalized groups. The first section of the article provides examples of bias based mostly on race, gender and the intersection between the two, because disability and persons with disabilities have, for the most part, been excluded from the discussions on AI and AI bias.Footnote 16

Secondly, we shall present a number of examples showing how the armed forces and the police have conducted operations with specific and disproportionate effects on people with disabilities, leading to both deaths and serious injuries. These examples demonstrate that persons with disabilities, and those with other intersecting identities and characteristics that have been historically marginalized, are at greater risk than the rest of the population, compounding the bias in AI.

Thirdly, we shall examine how remote warfare is already having a distinct impact on affected populations, what difficulties persons with disabilities face during conflicts, and the barriers to accessing justice and reparations. We will then analyze how autonomous weapons would compound the existing barriers and the disproportionate impact that persons with disabilities already face in armed conflict.

Finally, in the fourth section we shall see that failing to include and consult representative organizations of persons with disabilities during discussions on autonomous weapons, at the level of the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (Convention on Conventional Weapons, CCW) and at the national level, excludes their perspectives and experience. This de facto discrimination, compounded by the exclusion of persons with disabilities from the development of AI,Footnote 17 is an example of the extent to which persons with disabilities are being denied an opportunity to contribute their experience and expertise and to share their concerns on all issues and in all areas, as is their right. In forums where decisions are taken on the legality of weapons, the arguments of militarized States continue to dominate the agenda, rather than the perspectives of those who bear the greatest consequences: persons with disabilities and other historically excluded groups. Were we to act from a human security perspective, those groups would most assuredly take centre stage, as they suffer disproportionately from the ravages of conflict.

Debates of such importance to humanity should not take place without fully incorporating the perspectives of persons with disabilities. With this article, we hope to make this important gap visible, and to help narrow it.

Bias in AI and its relationship with the risks of autonomous weapons

The impact of bias in AI

We should start by recognizing that AI systems have had positive impacts in societies, especially when they have been deployed and implemented with due regard for participation, human rights, accessibility and gender equality. This is still far from being the norm, but when it happens, AI can contribute to social welfare and in particular to achieving sustainable development objectives.Footnote 18 The Special Rapporteur on the Rights of Persons with Disabilities has stated that the appropriate and responsible use of AI can promote progress regarding the rights of persons with disabilities in several areas, including employment, independence and education.Footnote 19

However, the same report pointed out that “artificial intelligence also poses acute challenges to the enjoyment of human rights. While many of those risks are shared with other groups, some are unique to persons with disabilities, or [those persons may] carry differentiated and disproportionate risks”.Footnote 20 Examples include use of AI by police, for crime prevention, in job interviews, when assessing eligibility for social protection programmes, in determining access to training, and in humanitarian situations, including armed conflict settings – especially with regard to autonomous weapons.

The UN Educational, Scientific and Cultural Organization (UNESCO) and its member States make a similar point in the Recommendation on the Ethics of Artificial Intelligence (the Recommendation) adopted in 2021. The Recommendation recognizes that AI systems carry new risks because of their potential to “reproduce and reinforce existing biases, and thus to exacerbate already existing forms of discrimination, prejudice and stereotyping”. Furthermore, the Recommendation affirms that “AI technologies … raise fundamental ethical concerns, for instance regarding the biases they can embed and exacerbate, potentially resulting in discrimination, inequality, digital divides, exclusion and threat to cultural, social and biological diversity and social or economic divides”.Footnote 21

Such risks and the negative impacts of AI have been widely documented nationally and internationally. Understanding these challenges is extremely relevant to the analysis of autonomous weapons systems because these are the kinds of problems that could be reproduced by the use of AI and emerging technologies in the military sector. For instance, a groundbreaking study by Buolamwini and Gebru in 2018 examined three commercial facial analysis algorithms and data sets and found that all classifiers performed best for lighter-skinned individuals and males, whereas they performed worst for darker-skinned females. While the maximum error rate of recognition for lighter-skinned males was 0.8%, the error rate was up to 34.7% for darker-skinned females.Footnote 22 This is just an example of error rate gaps on the basis of gender and skin colour in AI applications; let us imagine what this “error rate gap” would mean in the case of autonomous weapons systems, and who would be the most affected. As Whittaker et al. point out, AI systems, often marketed as making more objective decisions, have repeatedly produced biased and erroneous outputs – and “even when AI works as the designers intended, these systems are too often used in ways that serve the interests of those who already possess structural power, at the expense of those who don't”.Footnote 23
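To make the notion of an “error rate gap” concrete, the short Python sketch below shows one common way such disparities are quantified: a classifier's error rate is computed separately for each demographic subgroup and the best- and worst-served groups are compared. The records and function names are invented for illustration only; they do not reproduce the Buolamwini and Gebru benchmark or its methodology.

```python
# Minimal sketch: quantifying a classifier's "error rate gap" across subgroups.
# The records below are invented for illustration; they do not reproduce the
# Buolamwini and Gebru results, only the way such disparities can be measured.

from collections import defaultdict

# Each record: (subgroup label, true label, predicted label)
predictions = [
    ("lighter-skinned male", "A", "A"),
    ("lighter-skinned male", "B", "B"),
    ("darker-skinned female", "A", "B"),   # misclassified
    ("darker-skinned female", "B", "B"),
    ("darker-skinned female", "A", "B"),   # misclassified
]

def error_rates_by_group(records):
    """Return the share of misclassified samples for each subgroup."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, truth, predicted in records:
        totals[group] += 1
        if truth != predicted:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

rates = error_rates_by_group(predictions)
gap = max(rates.values()) - min(rates.values())
print(rates)                           # per-group error rates
print(f"error rate gap: {gap:.1%}")    # disparity between best- and worst-served groups
```

Even this toy calculation makes the underlying point visible: a system can report an impressive aggregate accuracy while failing one subgroup far more often than another, and only a disaggregated evaluation reveals it.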

The absence of diversity throughout the life cycle of AI is clearly one cause of this bias. It is important to remember that most AI is created, designed and implemented by people who have grown up in societies that constantly reproduce systems which create inequalities in access to rights and opportunities, such as patriarchy, colonialism, racism, heteronormativity, cisnormativity and ableism.Footnote 24 Ableism is of particular relevance for this analysis, as defined by the Special Rapporteur on the Rights of Persons with Disabilities in 2019:

a value system that considers certain typical characteristics of the body and mind as essential for living a life of value, [as it is based] on strict standards of appearance, functioning and behaviour …. Ableism leads to social prejudice, discrimination against and oppression of persons with disabilities, as it informs legislation, policies, and practices.Footnote 25

AI applications reproduce and amplify those prejudices. Indeed, as Whittaker et al. note,

[i]n modeling the world through data, AI systems necessarily produce and reflect a normative vision of the world. … Versions of normalcy reflected in the cultures and logics of corporate and academic tech environments are encoded in data and design and amplified through AI systems.Footnote 26

Bias has already had a negative effect in different sectors, such as employment, education, social protection, health, justice and the right to live in dignity.Footnote 27 What emerges from these examples is a pattern of marginalized populations repeatedly facing the negative consequences of the use of AI and emerging technologies, coupled with a lack of legislation, which hampers accountability, remedy and reparations.

Furthermore, the failure to include and recognize the diversity of the population in the criteria for the design and implementation of AI means that the priorities of marginalized groups are not reflected in the objectives and requirements defined for such systems, precisely because these groups are not consulted in decision-making about what is or is not acceptable to delegate to AI, and about whether or not it is necessary to legislate in this area.

When considering bias in autonomous weapons, we must therefore remember that they are not developed in a neutral context. On the contrary, they should be considered as both the cause and the consequence of a social, economic and technological system that constantly reproduces stereotypes, bias, discrimination and disproportionate negative consequences for marginalized groups – and for the global South in general.

The possible impact of AI bias in autonomous weapons

Let us recall that the protection of the rights of persons with disabilities is clearly codified in international law.Footnote 28 IHL refers to the general protection of civilians, including persons with disabilities, during international armed conflicts.Footnote 29 Article 16 of Geneva Convention IV entitles the “wounded”, “sick” and “infirm” to be treated as “objects of particular protection” during such conflicts. Article 27 of the same treaty states that all protected persons shall be treated with the same consideration “without any adverse distinction based, in particular, on race, religion or political opinion”, and the Special Rapporteur on the Rights of Persons with Disabilities has stated that “[t]his prohibition of adverse distinction (discrimination) is capacious enough to encompass disability”.Footnote 30 Furthermore, Additional Protocol I to the Geneva Conventions, in its Article 8, recognizes that “‘wounded’ and ‘sick’ means persons, whether military or civilians, who, because of trauma, disease or other physical or mental disorder or disability, are in need of medical assistance or care and who refrain from any act of hostility”.Footnote 31 Finally, as we consider that autonomous weapons would be a risk in international and non-international armed conflicts alike, it is also relevant to recall that the protection of civilians – including persons with disabilities – is guaranteed by the rules of customary IHL.Footnote 32

On the other hand, international human rights law, under Article 11 of the Convention on the Rights of Persons with Disabilities, requires States Parties to take “all necessary measures to ensure the protection and safety of persons with disabilities in situations of risk, including situations of armed conflict, humanitarian emergencies and the occurrence of natural disasters”.Footnote 33

Given this legal framework, we need to ask ourselves: will increasing the autonomy of weapons systems enhance States’ compliance with their existing legal obligations, or will such weapons constitute yet another obstacle to the exercise of those rights? As autonomous weapons have never been used on a large scale (and let us hope they never will be), the related bias has not been documented, but we believe it would be intellectually dishonest to claim that the same bias which has been thoroughly documented in the civilian sector, in different regions and contexts, would not apply to military applications and situations of conflict. Vanina Martínez, a researcher at Argentina's National Council for Scientific and Technical Research and the University of Buenos Aires, highlights two distinct aspects of the bias that could be found in these weapons: that linked to different groups of the population (as mentioned in the previous section), stemming from the use of data that reflects existing prejudices; and that linked to the context, in the sense that AI systems can only be based on part of the real world and cannot take account of every possible scenario, especially in the unpredictable situation of an armed conflict.Footnote 34

As mentioned above, the issue of how these weapons would affect persons with disabilities was raised in 2021 by the Special Rapporteur on the Rights of Persons with Disabilities, who specifically questioned the ability of weaponry directed by AI to “make the nuanced determination”, for instance, “as to whether an assistive device qualifies a person with disabilities as a threat”.Footnote 35 Given that persons with disabilities constantly face ableism, as has been defined above, it is highly probable that those same prejudices would be reflected in autonomous weapons, which would certainly not take account of the following considerations:Footnote 36

  • A person may use a wheelchair, walking stick, walker or crutches to move around, making their speed, height, and ability to react and move different from that of the rest of the population.

  • Not everyone communicates orally. It is impossible for a person who is deaf or hard of hearing to comply with an audible command or warning, or simply to look for refuge when the sounds of an attack can be the first sign of danger for others.

  • Blind persons and those with a visual impairment cannot make use of visual cues which may be given by autonomous weapons systems. They face barriers to mobility, concealment or even life-saving measures in the event of an attack, and would possibly need someone to explain the presence of autonomous weapons and/or what could be required of them in order to be safe in those weapons’ presence. Additionally, persons with daltonism may misinterpret light signals from such weapons or related systems.

  • Not everyone perceives or understands the world in the same way. For a person with an intellectual impairment, certain orders will be difficult to understand or obey. This condition may lead to additional stress when in the presence of an attack by autonomous weapons and thereby lead to greater trauma than that which would be experienced by the rest of the population.

  • People with psychosocial impairments might exhibit “unexpected” behaviour that autonomous weapons could be unable to process (such as lack of response, shouting or unexpected movements) or that such weapons might interpret as a risk, causing them to identify the person as a target.

  • Facial, iris or fingerprint recognition may not identify persons with characteristics such as eye deviation, inability to keep the head straight, or various skin conditions.Footnote 37

  • Devices or processes that currently use fingerprints are already causing situations of exclusion, since they are designed on the assumption that all persons have hands, fingers, fingerprints, and equal mobility in their arms and hands. For certain persons outside what is considered “the norm”, including persons with spasticity, taking fingerprints is practically impossible.

  • Similar issues exist with voice recognition: there are persons who do not communicate orally, who may require more time to express themselves or answer questions, or whose words may not come out as clearly as expected or comply with the required tonality to be considered “valid”.Footnote 38
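As a purely illustrative aside (set in a civilian verification context, with invented function and parameter names and no real biometric library implied), the sketch below shows how this kind of exclusion can be built directly into software: a routine that hard-codes a fixed response window and a single confidence threshold will systematically reject anyone who needs more time to respond, or whose input never matches a model built around “typical” bodies and voices, no matter how accurate that model is on average.

```python
# Illustrative sketch only: a rigid verification loop that excludes anyone who
# cannot respond within a fixed window or reach a single confidence threshold.
# All names and values are hypothetical; no real biometric API is implied.

from dataclasses import dataclass

@dataclass
class Attempt:
    seconds_to_respond: float  # how long the person needed to produce any input
    match_confidence: float    # 0.0-1.0 score from some matching model

RESPONSE_WINDOW_S = 5.0   # assumes every user can respond within five seconds
MATCH_THRESHOLD = 0.9     # assumes every legitimate user scores at least 0.9

def verify(attempt: Attempt) -> bool:
    """Reject slow or low-scoring attempts, however legitimate the person."""
    if attempt.seconds_to_respond > RESPONSE_WINDOW_S:
        return False  # e.g. a person who needs more time to speak, sign or move
    return attempt.match_confidence >= MATCH_THRESHOLD

# A person who needs twelve seconds to answer, or whose voice or fingerprint
# never scores 0.9 against a model built around "typical" inputs, always fails.
print(verify(Attempt(seconds_to_respond=12.0, match_confidence=0.95)))  # False
print(verify(Attempt(seconds_to_respond=3.0, match_confidence=0.6)))    # False
```

The point of the sketch is that the exclusion does not come from any single “error”: it follows directly from design assumptions about what a normal response looks like.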

These issues can become more complex when they occur simultaneously in persons with multiple disabilities, or when multiple systems that create inequality interact in an intersectional manner. For instance, in the case of an indigenous woman with a hearing impairment, AI applications in general – including those possibly embedded in autonomous weapons – would be incapable of processing the fact that she needs to communicate by signing, and to do so in her native language. As Moisés Vinacudo, an indigenous leader from the Murui Muina community in Colombia, explains:

As inanimate machines, autonomous weapons cannot understand or respect the value of life. Even though they might have the power to end life, they would not have the ability to understand what such a loss means nor what kind of impact this would have on the identities and realities of our communities.Footnote 39

It is important to note, as well, that these issues cannot be solved simply by including persons with disabilities in data sets. Whittaker et al. write:

The category of “disability” complicates pat classifications, and thus perturbs calls to simply include persons with disabilities in datasets, which are constructed around rigid models of categorization, however many categories they might include. … [T]he way in which “disability” resists fitting into neat arrangements points to bigger questions about how other identity categories, such as race, sexual orientation, and gender, are (mis)treated as essential, fixed classifications in the logics of AI systems.Footnote 40

Now, some readers may say that human beings also have those biases and reproduce them. While this is true, it must not be accepted as an excuse to allow machines to reproduce, perpetuate and escalate such biases as well, and to do so in weapons, where what is at risk is the right to life. Furthermore, humans who reproduce these biases can be held accountable, but lack of accountability, as we will see in the next sections, remains one of the major challenges relating both to AI systems and to the military today. As Ricaurte points out, autonomous weapons lie at the intersection of two historically patriarchal systems – technology and the military – thus amplifying the negative impact not only on women, but also on marginalized populations.Footnote 41

Given the problems set out in this article, it is clearly essential that persons with disabilities and those belonging to historically marginalized groups participate both in AI development and in discussion forums concerning autonomous weapons. Their perspective and their analysis are essential if we are to avoid reproducing the systems of oppression that they face. This requires ensuring accessibility, funding for their participation and reasonable adjustments, which are desperately lacking. The case of autonomous weapons is representative, as even though the right to life and well-being of these individuals is at stake, they are not included in or invited to national or international debates.

To conclude this section, we would like to raise the issue that even when “ethical” perspectives on autonomous weapons systems are discussed, the debates generally do not include persons with disabilities or persons from other marginalized groups, and rarely include persons from the global South. Nevertheless, there is no single set of ethics. It is necessary to include in this debate ethical perspectives from different geographical regions and groups, including persons with disabilities, conflict victims, feminist organizations and others. To date, discussions regarding the ethics of autonomous weapons have continued to highlight and centre the voices of defence, diplomacy, the private sector and academia, which, for the most part, do not represent the views of persons with disabilities and other marginalized groups. No debate on the ethics of autonomous weapons can be considered serious if the voices of those who risk being most affected are not heard.

Military and police violence against persons with disabilities as a precedent relevant to the development of autonomous weapons

We have now seen how bias in AI and emerging technologies has a disproportionate effect on marginalized groups. This bias both reflects and reinforces discrimination.

Persons with disabilities, in particular, continue to suffer multiple types of discrimination, including a higher risk of death, which increases disproportionately in situations of disaster, conflict and armed violence.Footnote 42 In this section, we provide examples of military and police violence against persons with disabilities in different contexts, to show how bias in these institutions already impacts such persons disproportionately. Autonomous weapons would compound this violence through the bias in AI explained in the previous section.

Military violence against persons with disabilities

Persons with disabilities face specific and disproportionate risks in military operations; here we will share a few cases to illustrate this.

Let us start with an emblematic Colombian case. In a report on extrajudicial executions (known as “false positives”) committed by army personnel during Colombia's internal armed conflict, Bustamante reports that

[t]he worst form of this war crime was against persons with intellectual disabilities, whose condition was deliberately abused to facilitate the army's criminal actions. Lies were used to ‘conscript’ and execute them. They were seen as spoils of war, a means of obtaining perverse benefits.Footnote 43

A study on disability and armed conflict in Colombia by the Universidad de los Andes found that the persons with disabilities killed in these executions included deaf persons, persons with intellectual disabilities, persons with bipolar disorder, persons living with epilepsy and persons living with osteoporosis.Footnote 44 The Special Jurisdiction for Peace, created to judge crimes committed during Colombia's armed conflict, states that a staggering 6,402 killings could have been “illegitimate deaths presented as persons killed in combat by agents of the State”.Footnote 45 The percentage of persons with disabilities among these victims is not mentioned, pointing to the lack of disaggregated data, which is an additional difficulty in identifying the human rights violations faced by persons with disabilities. By 2021, only eleven persons had been recognized as “penally responsible for war crimes” in Colombia.Footnote 46

A second case involves the disproportionate impact of conflict on persons with disabilities in Gaza. According to a report by the Disability Representative Bodies Network, twenty-three persons with disabilities died and approximately fifty were injured during the Israeli operation Protective Edge.Footnote 47 The report affirms that one of the contributing factors was that warnings of bombings communicated by telephone or in the form of flyers dropped from aircraft and drones did not reach persons with disabilities to the same degree as persons without disabilities.

Thirdly, in its report on persons with disabilities during conflict, Human Rights Watch records the case of a 43-year-old man from northeast Cameroon with intellectual and hearing impairments who died when soldiers from the Rapid Intervention Battalion shot him because he did not answer their questions.Footnote 48 Many more such cases doubtlessly go undocumented in various theatres of conflict, as a result of the lack of transparency and accountability regarding military operations and the lack of identification of persons with disabilities as such, coupled with a failure to disaggregate casualties by disability. Nonetheless, these examples certainly point to specific risks faced by persons with disabilities during military operations.

Police brutality against persons with disabilities

In addition to the use of violence by military forces, addressing police violence in the context of discussions on autonomous weapons is important because such weapons could find their way into the arsenals of police forces, which could increase human rights violations. As Wareham puts it, “[f]ully autonomous weapons systems need to be prohibited in all circumstances, including in armed conflict, law enforcement, and border control”.Footnote 49 The potential use of autonomous weapons in border control is another issue that, although outside of the scope of this article, should be of concern.Footnote 50

While discussions on autonomous weapons have so far been examined as they relate to situations governed by IHL, it is possible that if these weapons were to be developed, they would soon find their way into police arsenals. This has been the case with other weapons originally designed as military weapons that are now used by police.Footnote 51 Although the discussions on the legality of autonomous weapons systems at the Convention on Conventional Weapons are held in the context of IHL, for civil society the possible use of autonomous weapons in policing is of equal concern, as is the risk of such weapons making their way into the hands of illegal or non-State armed groups; these are both topics which would require further research.

For the moment, we shall mention only a few cases of how police brutality targets persons with disabilities, knowing that – as with similar cases related to the armed forces – there must be many more undocumented cases, forgotten by history and by justice, owing to the lack of transparency and accountability among many of these organizations. The following examples aim to show how, as in the military, human decision-making in policing is already biased against persons with disabilities, and such bias could be reflected in, and compounded by, the use of autonomous weapons.

According to a report by David M. Perry and Lawrence Carter-Long published by the Ruderman Family Foundation (a US organization for persons with disabilities), persons with disabilities make up between 30% and 50% of all individuals killed by police,Footnote 52 based on an analysis of police brutality in the United States between 2013 and 2015. The combination of skin colour and disability places some at higher risk – Haben Girma, a lawyer and activist who is black and deaf-blind, says, “Someone might be yelling for me to do something and I don't hear. And then they assume that I'm a threat.”Footnote 53

While cases of police brutality and use of excessive force against persons with disabilities have been recorded in the United States, that country does not, of course, have a monopoly on such practices. AP News reported in June 2021 that an autistic Palestinian man had been shot dead by Israeli police in Jerusalem allegedly because he did not respond as expected when they approached him. He was on his way to an educational institution when the incident occurred.Footnote 54 There must be many more cases around the world that have not been documented, especially those involving assaults on the physical or psychological well-being of deaf persons and those with intellectual and psychosocial impairments by police officers who expect immediate and standardized responses.

Why is it important to be aware of these instances of military and police violence with a disproportionate impact on persons with disabilities when we talk about autonomous weapons? Because delegating the critical functions of a weapon to autonomous systems – which would have tech-related biases – in the hands of organizations that already disproportionately kill persons with disabilities could only result in additional and specific risks for this group.

The intersection between military violence, police brutality and technological bias in a context of structural discrimination

War sets in motion certain political calculations and assumptions about which lives are worth preserving and which are not, about which lives are worth mourning and which are not, and about which are worth living and which are not. According to the US philosopher Judith Butler, in wars and situations influenced by war, “the apprehension of precariousness leads to a heightening of violence, an insight into the physical vulnerability of some set of others that incites the desire to destroy them”.Footnote 55

As pointed out by the Latin American Network of Persons with Disabilities and Survivors of Antipersonnel Mines and Explosive Remnants of War (RED-LAT)Footnote 56 in 2021,Footnote 57 the logic and dynamics of armed conflict exacerbate the strategies of control, domination, and militarization of life, as well as of bodies, community fabrics and the symbolic universes of individuals and communities. They also intensify violence against historically stigmatized people and groups, such as the Afro-descendant population, the LGBTQ+ community, indigenous peoples, ethnic groups, religious communities, peasant populations, women and girls, young people and persons with disabilities. RED-LAT states that “such identity dimensions end up becoming places of intersection of inequality, discrimination, displacement and exclusion, insofar as their conditions of precariousness are ontologically anchored to the body”.Footnote 58

From this perspective, use of excessive force by the military and police brutality may be attributed, at least in part, to individual, institutional and systemic prejudice against certain groups. It is no coincidence that the victims of police brutality and excessive military force in many countries are migrants, Afro-descendants, indigenous peoples, members of the LGBTQ+ community, women, activists, young people and persons with disabilities. Would not the criteria or algorithms on the basis of which autonomous weapons would make an attack reproduce and amplify human bias? Is it not likely that to an autonomous weapon, certain bodies, faces or reactions will appear more dangerous than others, based on social prejudices of which we already have examples both in the actions of certain military and police forces, and in AI applications?

Let us take a moment to draw a parallel with the issue of violence against girls and women in Latin America. According to the Inter-American Court of Human Rights, such violence is “rooted in concepts of the inferiority and subordination of women”.Footnote 59 The Committee on the Elimination of Discrimination against Women has also stated that gender-based violence, including killings, kidnappings and disappearances, should be seen not as a series of isolated or sporadic cases, but as the result of a structural situation, of social and cultural phenomena based on a culture of gender-based violence and discrimination.Footnote 60

Our analysis takes a similar approach: we are calling for autonomous weapons to be examined not in isolation, but rather in light of the systemic discrimination experienced by persons with disabilities and marginalized groups, especially during armed conflict. It is essential to recognize this context in order to understand that any “mistakes” that such weapons may make will not be isolated cases, but will rather be the results of this conjunction of situations of structural discrimination that have had, and will continue to have, a disproportionate effect on persons with disabilities and other marginalized groups, time and time again. We must not allow this. As an international community, we have the moral obligation to create new international legislation on these weapons, before they start to take lives on the basis of power asymmetries, ableism and other discriminatory systems such as racism, colonialism, heteronormativity and patriarchy.Footnote 61

The consequences of remote warfare and the barriers faced by persons with disabilities in conflict situations as a context for autonomous weapons

While autonomous weapons have not yet been used on a large scale, the need for a legal instrument has become even more urgent given that the autonomy of weapons is increasing at breakneck speed.Footnote 62 At the same time, we are already witnessing the impacts of remote warfare in a context where persons with disabilities are disproportionately affected and face enormous barriers in situations of conflict. Let us examine this last point in more detail.

The impact of remote warfare

Some countries claim that it would be premature to negotiate a legally binding instrument on autonomous weapons because we do not yet know what consequences such weapons would have. While these weapons have (fortunately) not yet been used on a large scale, we can nonetheless analyze the current impact of remote warfare and draw some well-informed conclusions.

In the pages that follow, we will be using the concept of “remote warfare” as defined by the Centre for Global Challenges: “Remote warfare is a form of military intervention characterised by a shift away from boots on the ground and towards light-footprint military operations.”Footnote 63 According to Watson and McKay, “remote warfare refers to an approach used by states to counter threats at a distance. … ‘[R]emoteness’ comes from a country's military being one step removed from the frontline fighting.”Footnote 64

While the concept of remote war refers to a wider context, weapons with increasing autonomy are already used in remote warfare. As in earlier sections, we will share some examples of the impact of remote warfare to illustrate our point.

Let us examine first the human cost of the remote war in Yemen, as documented by Shiban and Molyneux.Footnote 65 According to the survivors of such attacks, the unpredictable and frequent appearance of drones is affecting the mental health of the population: people are already living in permanent fear of being attacked at any moment and are in a state of constant frustration and apathy that has driven some people to suicide. Not knowing when or where an attack will occur, or who the target will be, is having different effects on different groups, and those effects are aggravated for persons with intersecting marginalized identities and characteristics. For instance, the effects on mental health have been more severe in the case of young people. Mothers report that their young children are suffering from insomnia, depression, anxiety and fear. Children are no longer attending school for fear of attacks or are attending only because their families force them to do so. Women have reported that the incidence of miscarriage has increased owing to the stress of constantly feeling under threat.Footnote 66 While the experience of persons with disabilities is not specifically documented by Shiban and Molyneux (precisely because of the lack of a disability perspective on these and other issues), we can assume that remote war is at least as traumatic – and probably more traumatic – for these persons than for the rest of the population, particularly for persons with intellectual or psychosocial impairments, and persons with disabilities that also belong to other groups which face specific challenges, such as young persons, women and children.Footnote 67

Were the weapons used in these attacks fully autonomous? We cannot know. Nevertheless, the above-cited research demonstrates that an increase in distance – in terms of both time and space – between perpetrator and victim has a demonstrable and specific effect on victims, including harm to the mental health of affected populations. This is compounded by the fact that, in conflict settings, mental health services and support services for persons with disabilities – where they existed in the first place – quickly break down. It is therefore probable that the impact of autonomous weapons on mental health would be at least as severe as that of ongoing remote warfare. We have to consider these elements – we must not allow States to act as if they do not know what the consequences of the use of autonomous weapons will be.

Indeed, we believe the argument that negotiating a legally binding instrument on autonomous weapons would be premature (as claimed by Australia, South Korea, the United States, the United Kingdom and Russia, among othersFootnote 68) to be erroneous. It is perfectly possible to deduce the impact that such weapons would have from the known consequences of remote warfare. As Demmers says, “the term ‘remote warfare’ in itself sounds very clean and very controlled and distanced. … [W]ar has perhaps become distanced and sanitized for some, but remains brutal and intimate and physical to those at the receiving end of it.”Footnote 69

The impact of humanitarian crises and the lack of access to justice for persons with disabilities

The negative impact of remote warfare is compounded by two factors: the barriers faced by persons with disabilities in accessing humanitarian assistance and the barriers faced by victims of unlawful attacks in accessing justice and reparation mechanisms.

Firstly, we must remember that persons with disabilities face physical, attitudinal, legal, economic, communication and other types of barriers in every context. These barriers are exacerbated by, and their impact is even more severe in, humanitarian, conflict and post-conflict situations. In a study by Handicap International, 54% of respondents with disabilities said they had experienced a direct physical impact, sometimes causing new impairments. A further 27% reported that they had been psychologically, physically or sexually abused, and 38% had suffered negative effects on their mental health. Three quarters of persons with disabilities reported that, due to conflict, they did not have adequate access to basic assistance such as water, shelter, food and health care.Footnote 70 Furthermore, according to an investigation by the Geneva Academy of International Humanitarian Law and Human Rights (Geneva Academy), persons with disabilities living in conflict zones are at higher risk of being institutionalized or being victims of selective killings, and may be used as human shields. Women and girls with disabilities face a heightened risk of sexual and gender-based violence.Footnote 71 The situation is particularly complex in rural and remote areas, which already have less access to services in general, particularly for indigenous groups.

Let us now examine the situation in another key area, that of access to justice – in particular, accountability, the right to remedy and obtaining reparations. According to Docherty,

a variety of legal obstacles make it likely that humans associated with the use or production of these weapons – notably operators and commanders, programmers, and manufacturers – would escape liability for the suffering caused by fully autonomous weapons. Neither criminal law nor civil law guarantees adequate accountability for individuals directly or indirectly involved in the use of fully autonomous weapons.Footnote 72

This is one of the great difficulties regarding accountability and access to justice, and one faced by all potential victims of these weapons.

Now let us consider the specific situation of persons with disabilities. According to the Geneva Academy, persons with disabilities are systematically denied access to justice when they have been victims of violations of IHL, and no attention is paid to ensuring that victims of conflict with disabilities are able to access and participate in judicial processes.Footnote 73

Let us examine this aspect – which is serious enough in itself – in light of what we are already seeing with remote warfare, through a specific example. In 2021, the US Air Force killed ten civilians in Kabul, including seven children and an aid worker.Footnote 74 What were the consequences? The Pentagon announced that the military personnel involved would not be disciplined.Footnote 75 The inspector-general of the US Air Force said an “honest mistake” had occurred as a result of errors of execution and communication problems,Footnote 76 while the Pentagon stated that it was looking at the possibility of making “condolence payments” to surviving family members.Footnote 77 Those family members are still waiting and continue to demand justice.Footnote 78 There are many more examples of such mistakes, and justice remains an illusion for the majority of victims.

What does all this tell us? It tells us that autonomous weapons would be developed and used in a context in which it is already the exception, rather than the norm, that victims are able to access justice – and this situation is even worse in the case of persons with disabilities. The characteristics of autonomous weapons, including those related to predictability and understandability,Footnote 79 would render accountability, remedy, reparations and, more generally, access to justice even more difficult for persons with disabilities – one of the groups hardest hit by conflict, with the greatest difficulties in obtaining justice. As Boulanin, Bruun and Goussac affirm, “autonomy opens up the possibility for IHL provisions to be exercised by a complex web of human and artificial agents, based on automated processes and in expanded and more complex geographical and temporal circumstances”, raising concerns that IHL violations cannot be “satisfactorily attributed, discerned, or scrutinized and, as a result, an individual or state responsible for an IHL violation is not held to account or punished for it”.Footnote 80 This concern would apply to any victim, but systemic discrimination would make it even worse for persons with disabilities.

This lack of accountability does not pertain exclusively to the use of AI and emerging technologies in the military sector. As Stauffer points out, the growing reliance on big-data analytics and algorithms to assist in, or even replace, predictive decision-making by humans “could come at profound cost in the years ahead by causing us to lose faith in our own ability to discern the truth and assign responsibility for bad decisions. Without someone to hold accountable, it is nearly impossible to vindicate human rights.”Footnote 81 Access to justice would be even more difficult if these technologies were used in weapons. As Docherty says in her analysis of the “accountability gap”, the obstacles to assigning responsibility would prevent those responsible from being held legally and morally accountable for unlawful killings and other harms.Footnote 82

We believe it is essential to consider these different angles of analysis when discussing autonomous weapons systems. This will allow us to understand the broader context in which such weapons would be used and developed, and to render visible how and why they would have a specific impact on the lives of persons with disabilities, exacerbating the already disproportionate impact of conflict on their human rights.

The exclusion of persons with disabilities from current debate on autonomous weapons

The Convention on the Rights of Persons with Disabilities, in its Article 4(3), clearly requires States Parties to closely consult with and actively involve persons with disabilities in decision-making processes, particularly through their representative organizations. We shall now look at why the exclusion of persons with disabilities from current discussions on autonomous weapons is of concern and results in continued disregard for the rights and perspectives of persons with disabilities, exacerbating their exclusion and their inability to influence decisions that will affect them disproportionately.

The contribution of persons with disabilities to the processes of humanitarian disarmament

Matters related to the military and the type and use of different weapons are usually considered to be of interest mainly to the armed forces, politicians and, in international forums, the diplomatic corps. As a result, disarmament and arms control have been discussed in these forums without most countries giving due weight to the experience, feelings and thinking of civil society organizations, in particular those of persons with disabilities, including survivors of indiscriminate weapons. Such weapons include anti-personnel mines and cluster munitions, the devastating effects of which still affect thousands of people, and nuclear weapons, which threaten the existence of the human race.

Working on the basis of their experience, civil society organizations including persons with disabilities have made technical contributions to the processes that resulted in the adoption and entry into force of several international instruments governing humanitarian disarmament, arms control and non-proliferation. Their inputs and actions contributed to the adoption of the treaties that prohibit the manufacture, use, stockpiling, export and import of anti-personnel minesFootnote 83 and cluster munitionsFootnote 84 for States Parties, and that require them to destroy their stockpiles. States Parties are also obliged to provide assistance to the victims of these weapons. More recently, a treaty prohibiting the use of nuclear weapons was adopted, again with the participation and contributions of survivors of these horrific weapons.Footnote 85

The role of persons with disabilities, particularly survivors, in the negotiations that led to the adoption and entry into force of these instruments is perfectly clear for those who were part of those processes.Footnote 86 In the case of the negotiations on the Convention on Cluster Munitions, for instance, Borrie affirms that survivors had an “especially powerful effect on even the most hardened and cynical delegates”.Footnote 87 According to Human Rights Watch, “[s]urvivors not only provided heart-wrenching testimony that moved participants, but also skillfully lobbied for and gave interventions on specific legal provisions, such as a victim assistance obligationFootnote 88 and an absolute ban”.Footnote 89

As part of civil society, organizations of survivors of anti-personnel mines, cluster munitions and nuclear weaponsFootnote 90 – many of whom are persons with disabilities – have been an essential part of the processes that led to these important developments in IHL. Through their experience and knowledge, they demonstrated to the international community that such weapons should never have existed because of their indiscriminate effects on civilian populations.Footnote 91 This led the international community to reflect on and take a more comprehensive approach to addressing the unacceptable impact of these weapons and to address these issues from a humanitarian and human rights perspective.

So, on the one hand, persons with disabilities have the right to participate and to bring their concerns and perspectives forward; on the other, their contributions are fundamental precisely because they bring different perspectives to the table. Yet their perspectives remain excluded from the discussions on autonomous weapons, particularly at the CCW.

The exclusion of persons with disabilities from discussions concerning autonomous weapons

As of September 2022, the CCW meetings which address the risks of autonomy in weaponry have not included interventions by, contributions from, or the participation of representative organizations of persons with disabilities (including those of survivors of different weapons), despite these persons having the right to have their perspectives considered, as explained in the previous section.

Now, some may argue that the case of autonomous weapons systems is different from that of anti-personnel mines, cluster munitions or nuclear weapons, since in those three cases there were already victims of such weapons when the prohibition treaties were negotiated. While this is true, the experience, priorities and needs of persons with disabilities and survivors of conflict are always relevant, on all topics and in all sectors, and such persons have the human right to be consulted. This is even more pressing in the case of autonomous weapons, since such weapons would probably have a disproportionate impact on persons with disabilities, for the reasons presented in the preceding sections.

Persons with disabilities have a right to participate in all these processes and to contribute to debates on what is acceptable and legal during war, such as those taking place on autonomous weapons systems at the CCW. Their experience would be extremely useful in current debates concerning new-generation weapons that might fail to recognize the characteristics of people who fall outside (wrongly) standardized human and social frameworks.

Apart from the fact that they have the right to be present, there are three other main reasons why organizations of persons with disabilities should be included in all debates on autonomous weapons. Firstly, these weapons could have a disproportionate impact on persons with disabilities. Such weapons, in fact, can be deemed indiscriminate on account of the biases they incorporate and their inability to identify the features of persons with disabilities and other marginalized groups. In other words, they may be indiscriminate by nature – as is the case with anti-personnel mines and cluster munitions – and disproportionately affect persons with disabilities.

Secondly, persons with disabilities and other marginalized groups, particularly those in the global South and those that are currently experiencing conflict, would probably be the first to suffer the effects of autonomous weapons. The perspectives of the people at risk need to be at the centre of the discussions, not the interests of military powers. We must recognize that there is an asymmetry in the impact of autonomous weapons on different populations. It is easy – and even self-serving – to assert that new legally binding rules on autonomous weapons are not necessary when one knows that one's country would not be the place where such weapons would, at least in principle, be tested and used,Footnote 92 or when one does not belong to any of the marginalized groups that would be most affected.

Thirdly, it is essential to include organizations of persons with disabilities in discussions on autonomous weapons so that humanitarian and disability perspectives are incorporated more systematically. However, at least since 2019, not a single organization of persons with disabilities has made an intervention at the CCW, and we have found no evidence of their being included in consultations at the national level. All of this is extremely concerning.

International debates show that, in general, States still have no real interest in listening to persons with disabilities. Most fail to include persons with disabilities and representatives of their organizations in their own delegations, or to include them in national consultations on different issues, including autonomous weapons and AI. Let us say it clearly: this is not only a question of “improving” data sets to include disability; it is a question of ensuring that persons with disabilities participate in the forums where the acceptability of specific uses of these technologies is debated and where decisions on the need for new international instruments are taken. By failing to ensure the participation of and consultation with organizations of persons with disabilities, States miss out on an opportunity to include broad perspectives derived from the experiences of this population.

The main objective of the international community should be to strengthen efforts to ensure greater protection for civilians, in particular marginalized groups such as persons with disabilities. We must not wait until we have hundreds or thousands of casualties to adopt new international law on autonomous weapons systems, as was the case with other weapons – on the contrary, we must take urgent action in response to the risks that have already been identified. States have the moral imperative and the responsibility to launch, urgently, an effective negotiation process for an instrument on autonomous weapons systems. After the discussions in the framework of the CCW in the past two years, including its Review Conference, it seems impossible that such an instrument will be achieved in that forum.

Conclusions

In this article we have shown that persons with disabilities would be disproportionately affected by the use of autonomous weapons because of the systemic discrimination and barriers they continue to face in various areas. This discrimination would be replicated and exacerbated by autonomous weapons, with appalling and unacceptable consequences that would threaten the right to life itself. Indeed, we believe that analyzing the logic and dynamics of war goes beyond simply describing its forms and the ways in which it unfolds. To fully understand its impact, we must analyze the disproportionate effect of war and of certain weapons on persons with disabilities and other marginalized groups; the contexts in which such weapons are developed and would be used; and the systemic discrimination that still permeates our societies.

The factors that reproduce and reinforce discrimination against persons with disabilities include biases in data and decision-making related to AI, violence by military and police forces, and the lack of equal access to humanitarian aid and to justice and reparation mechanisms. Such discrimination is equally reflected in the debates on autonomous weapons, where there is a lack of encouragement of and support for the participation of organizations representing persons with disabilities, at both the national and international levels.

However, we need to take a much broader view of this topic. Considering it acceptable, ethical, legitimate and legal to delegate life-and-death decisions to autonomous systems – a serious issue in itself – would have major implications for our relationship with technology. If we believe that decisions over the right to life can be delegated to autonomous technology, why should we not also delegate decisions over the right to social protection, employment, health or justice to such systems? We must ask ourselves who stands to win and who stands to lose as a result of these decisions, and of the lack of action and legislation in this area.

We must also remember that the face of another person reveals to us their suffering, their fragility. Faces have the power to generate moral imperatives, to awaken our common sense, our emotions. What happens to responsibility, compassion, empathy, shame, a sense of injustice and humanity itself, if we use autonomous weapons?

Autonomous weapons do not benefit the majority of nations, nor the majority of populations. On the contrary, as Bengio notes, autonomous weapons can be seen as an example of AI functioning as “a tool that can be used by those in power to keep that power, and to increase it”.Footnote 93 As investments in AI for autonomy in weapons systems increase, we, speaking from a civil society standpoint, would like to see investments of this magnitude directed towards sectors that genuinely contribute to human security and to combating inequality: inclusive education, universal social protection, accessible health services, access to justice, response to gender-based violence, gender equality and climate justice.

As of September 2022,Footnote 94 eighty-three States have called for a legally binding instrument on these weapons. The ICRC itself has reiterated that “new legally binding rules on autonomous weapons are urgently needed”,Footnote 95 joining civil society and many other stakeholders in this call. Yet, the CCW has still not taken any concrete steps towards this goal. We therefore urge countries that are truly committed to human rights, the protection of civilians and IHL to initiate a process in some other forum, where a treaty that guarantees meaningful human control over the use of force, through prohibitions and regulations, can be negotiated in good faith. Such a process must truly place the ethical, humanitarian and human rights concerns evoked in this article at its centre and must include the views and contributions of persons with disabilities and other marginalized groups.

Autonomy in the critical functions of weapons systems is on the rise, and inaction is not neutral: it benefits those who are currently developing these weapons. Every day that passes without a legally binding instrument on autonomous weapons is another day that normalizes and enables their development and future use. How much longer are we going to wait? Who will be responsible for the future victims of autonomous weapons resulting from inaction on the part of the international community?

Footnotes

*

This article was self-financed by the authors with a contribution from the Campaign to Stop Killer Robots.

The advice, opinions and statements contained in this article are those of the author/s and do not necessarily reflect the views of the ICRC. The ICRC does not necessarily represent or endorse the accuracy or reliability of any advice, opinion, statement or other information provided in this article.

References

1 Philip Alston, Interim Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, UN Doc. A/65/321, 23 August 2010, para. 28, available at: https://digitallibrary.un.org/record/690463?ln=en (all internet references were accessed in October 2022).

2 Christof Heyns, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, UN Doc. A/HRC/23/47, 9 April 2013, pp. 1–3, 5, available at: www.ohchr.org/sites/default/files/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.

3 It is important to note that while the mandate of the GGE on LAWS relates to “lethal” autonomous weapons systems, throughout this article we use the term “autonomous weapons” or “autonomous weapons systems” recognizing that, as the ICRC has explained, lethality is not an inherent property of a weapon, but depends on the weapon and the context of its use. For details, see International Committee of the Red Cross (ICRC), “CCW Meeting of Experts on Autonomous Systems: Session on Technical Issues”, 14 May 2014, available at: https://docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-_Informal_Meeting_of_Experts_(2014)/ICRC%2BLAWS%2B2014%2Btechnical%2Baspects.pdf.

4 Timothy McFarland, “The Status of Autonomous Weapon Systems under International Humanitarian Law”, PhD thesis, Melbourne Law School, University of Melbourne, April 2017, available at: https://rest.neptune-prod.its.unimelb.edu.au/server/api/core/bitstreams/f1a35baf-3656-5627-bd81-b7d1862e5f02/content.

5 ICRC, ICRC Position on Autonomous Weapon Systems, May 2021, p. 2, available at: https://shop.icrc.org/icrc-position-on-autonomous-weapon-systems-pdf-en.html.

6 See, for example, ibid.; Article 36, “Targeting People”, November 2019, available at: https://article36.org/wp-content/uploads/2019/11/targeting-people.pdf; Anna Turek and Richard Moyes, “Sensor-Based Targeting Systems: An Option for Regulation”, Article 36, November 2021, available at: https://article36.org/wp-content/uploads/2022/01/Sensor-based-targeting.pdf; Vincent Boulanin, Laura Bruun and Netta Goussac, Autonomous Weapon Systems and International Humanitarian Law: Identifying Limits and the Required Type and Degree of Human-Machine Interaction, Stockholm International Peace Research Institute, June 2021, available at: www.sipri.org/sites/default/files/2021-06/2106_aws_and_ihl_0.pdf; Human Rights Watch, Heed the Call: A Moral and Legal Imperative to Ban Killer Robots, August 2018, available at: www.hrw.org/report/2018/08/21/heed-call/moral-and-legal-imperative-ban-killer-robots.

7 See, for example, Hayley Ramsay-Jones, Racism and Fully Autonomous Weapons, submission to the UN Special Rapporteur regarding the thematic report on new information technologies, 29 January 2020, available at: https://bit.ly/3G2hfdS; Marissa Conway, Smashing the Patriarchy: The Feminist Case Against Killer Robots, Centre for Feminist Foreign Policy, London, August 2020, available at: https://bit.ly/38GwLPJ; Republic of Costa Rica, Republic of Panama, Republic of Peru, Republic of the Philippines, Republic of Sierra Leone and Eastern Republic of Uruguay, Joint Working Paper, available at: https://documents.unoda.org/wp-content/uploads/2021/06/Costa-Rica-Panama-Peru-the-Philippines-Sierra-Leone-and-Uruguay.pdf.

8 See, for instance, Ray Acheson, “Editorial: Multilateralism vs. Consensus in the Quest for a Mandate”, CCW Report, Vol. 9, No. 12, 2021, p. 2, available at: https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2021/RevCon/reports/CCWR9.12.pdf; Wanda Muñoz and Mariana Díaz, The Risks of Autonomous Weapons: An Intersectional Analysis, SEHLAC, 2021, available at: https://bit.ly/3vh7fZb.

9 See, for instance, Gerard Quinn, Report of the Special Rapporteur on the Rights of Persons with Disabilities, UN Doc. A/HRC/49/52, 28 December 2021, available at: www.ohchr.org/en/documents/thematic-reports/ahrc4952-artificial-intelligence-and-rights-persons-disabilities-report; Wanda Muñoz Jaime and Mariana Díaz Figueroa, “Armas autónomas: La Inaceptable reproducción de sistemas de opresión en tecnología militar”, in Aleida Fernández, Clara Duarte and Dora Inés Munévar (eds), Discapacidad, conflicto armado y construcción de paz, Universidad Nacional de Colombia, Centro Editorial Facultad de Medicina, Bogotá, March 2021, available at: https://repositorio.unal.edu.co/handle/unal/79707.

10 Gerard Quinn, Report of the Special Rapporteur on the Rights of Persons with Disabilities, UN Doc. A/76/146, 19 July 2021, p. 9, available at: https://daccess-ods.un.org/access.nsf/Get?OpenAgent&DS=A/76/146&Lang=E.

11 G. Quinn, above note 9, p. 13.

12 For more on intersectionality, see Association for Women's Rights in Development, Intersectionality: A Tool for Gender and Economic Justice, Women's Rights and Economic Change No. 9, August 2004, available at: www.intergroupresources.com/rc/Intersectionality%20-%20a%20Tool%20for%20Gender%20&%20Economic%20Justice.pdf.

13 Convention on the Rights of Persons with Disabilities, UN Doc. CRPD/C/GC/6, 26 April 2018 (CRPD), p. 5, available at: https://digitallibrary.un.org/record/1626976?ln=en#record-files-collapse-header.

14 Ray Acheson, Autonomous Weapons and Patriarchy, Women's International League for Peace and Freedom, 2020, p. 4, available at: https://reachingcriticalwill.org/images/documents/Publications/aws-and-patriarchy.pdf.

15 Paola Ricaurte et al., “AI Decolonial Manyfesto”, 2021, available at: https://manyfesto.ai/index.html.

16 Meredith Whittaker et al., “Disability, Bias, and AI”, AI Now Institute, November 2019, p. 8, available at: https://ainowinstitute.org/disabilitybiasai-2019.pdf.

18 For more on positive impacts of AI and areas for future work, see Future Society and Global Partnership on AI, Areas for Future Action in the Responsible AI Ecosystem, December 2020, available at: https://gpai.ai/projects/responsible-ai/areas-for-future-action-in-responsible-ai.pdf.

19 G. Quinn, above note 9.

20 Ibid., p. 4.

21 UNESCO, Recommendation on the Ethics of Artificial Intelligence, p. 4, available at: https://unesdoc.unesco.org/ark:/48223/pf0000380455.

22 Joy Buolamwini and Timnit Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification”, Proceedings of Machine Learning Research, Vol. 81, 2018, available at: www.media.mit.edu/publications/gender-shades-intersectional-accuracy-disparities-in-commercial-gender-classification/. Regarding AI and gender and racial bias, see also Shalini Kantayya, “Coded Bias”, Algorithmic Justice League, available at: www.ajl.org/spotlight-documentary-coded-bias. For more on how AI decision-making can lead to discrimination, and for examples in specific sectors, see Frederick Zuiderveen Borgesius, Discrimination, Artificial Intelligence, and Algorithmic Decision-Making, Council of Europe, 2018, available at: https://rm.coe.int/discrimination-artificial-intelligence-and-algorithmic-decision-making/1680925d73.

23 M. Whittaker et al., above note 16, p. 7.

24 To learn more about systems of oppression, multiple discrimination and structural inequalities, see, for example, Egale, “Terms and Definitions: Systems of Oppression and Privilege”, available at: https://egale.ca/awareness/systems-of-oppression-and-privilege-terms/. On disability, multiple discrimination and intersectionality, see CRPD, above note 13.

25 Catalina Devandas-Aguilar, Report of the Special Rapporteur on the Rights of Persons with Disabilities, UN Doc. A/HRC/43/41, 17 December 2019, p. 3, available at: https://daccess-ods.un.org/access.nsf/Get?OpenAgent&DS=A/HRC/43/41&Lang=E.

26 M. Whittaker et al., above note 16.

27 For examples and further information concerning possible bias in autonomous weapons, see SEHLAC, Autonomous Weapons Systems: An Analysis from Human Rights, Humanitarian and Ethical Artificial Intelligence Perspectives, 2021, available at: https://bit.ly/3lMC8l5.

28 For a comprehensive review of the rights of persons with disabilities in the context of IHL, see ICRC, “IHL and Persons with Disabilities”, legal fact sheet, 4 October 2017, available at: www.icrc.org/en/document/ihl-and-persons-disabilities. Also see the articles by Janet Lord and Alex Breitegger in this issue of the Review.

29 Geneva Convention (IV) relative to the Protection of Civilian Persons in Time of War of 12 August 1949, 75 UNTS 287 (entered into force 21 October 1950), Arts 16, 27, available at: https://ihl-databases.icrc.org/ihl/INTRO/380.

30 G. Quinn, above note 10, p. 13.

31 Protocol Additional (I) to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts, 1125 UNTS 3, 8 June 1977 (entered into force 7 December 1978), available at: https://ihl-databases.icrc.org/ihl/WebART/470-750013?OpenDocument.

32 ICRC, “Conflictos internos u otras situaciones de violencia: ¿Cuál es la diferencia para las víctimas?”, 10 December 2012, available at: www.icrc.org/es/doc/resources/documents/interview/2012/12-10-niac-non-international-armed-conflict.htm.

33 CRPD, above note 13, Art. 11.

34 Arthur Holland Michel, “Known Unknowns: Data Issues and Military Autonomous Systems”, GGE on LAWS, 10 August 2021, available at: https://unidir.org/events/known-unknowns-data-issues-and-military-autonomous-systems.

35 G. Quinn, above note 9, p. 13.

36 W. Muñoz Jaime and M. Díaz Figueroa, above note 9.

37 Such problems with facial and iris recognition are currently present in electronic devices with such features.

38 The authors have heard first-hand accounts of persons with disabilities who have been denied the ability to open a bank account because of the impossibility of taking their fingerprints or other biometric data, including voice recognition.

39 SEHLAC and Colombian Campaign to Ban Landmines, “Moises Vinacudo, de la @COL_SIN_MINAS nos habla sobre cómo las #armasautónomas pueden tener un impacto desproporcionado y diferenciado en las comunidades #indígenas”, Twitter, 7 September 2021, available at: https://tinyurl.com/577hmep6.

40 M. Whittaker et al., above note 16.

41 Paola Ricaurte, speaking at the panel on “Los riesgos de las armas autónomas y el rol de la comunidad científica” (“The Risks of Autonomous Weapons and the Role of the Scientific Community”), Reunión Internacional de Inteligencia Artificial, 27 August 2021.

42 See, for example, UNSC Res. 2475, 20 June 2019, available at: https://daccess-ods.un.org/access.nsf/Get?OpenAgent&DS=s/res/2475(2019)&Lang=E; United Nations Office for Disaster Risk Reduction, Living with Disability and Disasters: UNISDR 2013 Survey on Living with Disabilities and Disasters – Key Findings, 2014, available at: www.unisdr.org/2014/iddr/documents/2013DisabilitySurveryReport_030714.pdf.

43 Juliana Bustamante, “¿Qué les pasó en la guerra a las personas con discapacidad?”, El Espectador, 31 May 2019 (authors’ translation), available at: www.elespectador.com/colombia-20/analistas/que-les-paso-en-la-guerra-a-las-personas-con-discapacidad-article/.

44 Program of Action for Equality and Social Inclusion, Discapacidad y conflicto armado en Colombia: En busca de un relato ausente, Bogotá, 2020, available at: https://paiis.uniandes.edu.co/wp-content/uploads/web_Discapacidad-y-conflicto-armado-en-Colombia-en-busca-de-un-relato-ausente.pdf.

45 Special Jurisdiction for Peace, “Principales estadísticas”, 11 February 2022, available at: www.jep.gov.co/jepcifras/JEP%20en%20cifras%20-%20febrero%2011%20de%202022.pdf#search=6402.

46 Special Jurisdiction for Peace, “JEP imputa crímenes de guerra y de lesa humanidad a un general, 6 oficiales y 3 suboficiales del ejército, y a un tercero civil, por ‘falsos positivos’ en Catatumbo”, 2021, available at: www.jep.gov.co/Sala-de-Prensa/Paginas/JEP-imputa-cr%C3%ADmenes-de-guerra-y-de-lesa-humanidad-a-10-militares-y-un-civil-por-%27falsos-positivos%27-en-Catatumbo.aspx.

47 A. Hussein Abdel Rahman Hammad, The Suffering of Persons with Disabilities from the Violations of Israeli Occupation Forces during the Operation Protective Edge, 2014, available at: www.map.org.uk/downloads/thesuf1.pdf.

48 Human Rights Watch, “Persons with Disabilities in the Context of Armed Conflict: Submission to the UN Special Rapporteur on the Rights of Persons with Disabilities”, 8 June 2021, available at: www.hrw.org/news/2021/06/08/persons-disabilities-context-armed-conflict.

49 Mary Wareham, “Don't Arm Robots in Policing”, Human Rights Watch, 24 March 2021, available at: www.hrw.org/news/2021/03/24/dont-arm-robots-policing.

50 See, for instance, Joe Sabala, “Israel Deploys Semi-Autonomous Machine Gun Robot to Gaza Border”, Defense Post, 1 July 2021, available at: www.thedefensepost.com/2021/07/01/israel-machine-gun-robot-gaza-border/.

51 See, for instance, Marvin Mack, “How America's State Police Got Military Weapons”, Insider, 28 April 2021, available at: www.businessinsider.com/how-did-local-police-acquire-surplus-military-weapons-2020-8.

52 David M. Perry and Lawrence Carter-Long, The Ruderman White Paper on Media Coverage of Law Enforcement Use of Force and Disability: A Media Study (2013–2015) and Overview, Ruderman Family Foundation, March 2016, available at: https://rudermanfoundation.org/wp-content/uploads/2017/08/MediaStudy-PoliceDisability_final-final.pdf.

53 Abigail Abrams, “Black, Disabled and at Risk: The Overlooked Problem of Police Violence against Americans with Disabilities”, Time, 25 June 2020, available at: https://time.com/5857438/police-violence-black-disabled/.

54 Ilan Ben Zion, “Officer May Face Charges in Killing of Autistic Palestinian”, AP News, 21 October 2020, available at: https://tinyurl.com/4vc6xswp.

55 Judith Butler, Frames of War: When Is Life Grievable?, Verso, London and New York, 2009, p. 2.

56 For more on the purpose and work of RED-LAT, see RED-LAT, Declaración de la Red Latinoamericana de Personas con Discapacidad y Sobrevivientes de Accidentes por Minas Antipersonal y Restos Explosivos de Guerra, Bogotá, Colombia, 17 November 2017, available at: www.cud.unlp.edu.ar/uploads/docs/declaracion_bogota_red_de_sobrevivientes_de_map__reg_y_pcd.pdf.

57 RED-LAT and Humanity & Inclusion, Aportes de la RED-LAT Red Latinoamericana de Sobrevivientes de Minas Antipersonal, Restos Explosivos de Guerra y otras Personas con Discapacidad a la reducción de la violencia en la región de América Latina, July 2021, available at: https://tinyurl.com/3s6f97dp.

58 Ibid., p. 11 (authors’ translation).

59 Inter-American Court of Human Rights, Cuadernillo de jurisprudencia de la Corte Interamericana de Derechos Humanos, No. 4, 2018, p. 9 (authors’ translation), available at: www.corteidh.or.cr/sitios/libros/todos/docs/cuadernillo4.pdf.

61 To learn more about these systems, see the references cited in note 24. On heteronormativity, see European Institute for Gender Equality, “Heteronormativity”, available at: https://eige.europa.eu/thesaurus/terms/1237; Stephen Wood, “Heteronormativity”, Eldis, available at: https://www.eldis.org/keyissues/heteronormativity. On patriarchy and autonomous weapons systems, see Ray Acheson, “Feminist Perspectives on Autonomous Weapon Systems”, Reaching Critical Will, 2020, available at: www.reachingcriticalwill.org/resources/publications-and-research/publications/14975-feminist-perspectives-on-autonomous-weapon-systems; M. Conway, above note 7. On colonialism (and specifically its link to AI) and current efforts on “decolonial AI”, see Katharine Miller, The Movement to Decolonize AI: Centering Dignity Over Dependency, Stanford University, Human-Centered Artificial Intelligence, 21 March 2022, available at: https://hai.stanford.edu/news/movement-decolonize-ai-centering-dignity-over-dependency. On violence and discrimination based on sexual orientation and gender identity, see Alon Margalit, “Still a Blind Spot: The Protection of LGBT Persons during Armed Conflict and Other Situations of Violence”, International Review of the Red Cross, Vol. 100, No. 907–909, 2018, available at: www.corteidh.or.cr/tablas/r39345.pdf.

62 See, for example, Daan Kayser, Increasing Autonomy in Weapons Systems: 10 Examples that Can Inform Thinking, Automated Decision Research Project of Stop Killer Robots, in conjunction with PAX, December 2021, available at: https://bit.ly/3uJdRAt; Zachary Kallenborn, “Russia May Have Used a Killer Robot in Ukraine. Now What?”, Bulletin of the Atomic Scientists, 15 March 2022, available at: https://thebulletin.org/2022/03/russia-may-have-used-a-killer-robot-in-ukraine-now-what/.

63 Centre for Global Challenges, “The Intimacies of Remote Warfare”, Utrecht University, available at: www.uu.nl/en/organisation/centre-for-global-challenges/projects/the-intimacies-of-remote-warfare.

64 Abigail Watson and Alasdair McKay, “Remote Warfare: A Critical Introduction”, in Alasdair McKay, Abigail Watson and Megan Karlshøj-Pedersen (eds), Remote Warfare: Interdisciplinary Perspectives, E-International Relations, Bristol, 2021, p. 7, available at: www.e-ir.info/publication/remote-warfare-interdisciplinary-perspectives/, citing Emily Knowles and Abigail Watson, Remote Warfare: Lessons Learned from Contemporary Theatres, Oxford Research Group Remote Warfare Programme, June 2018, available at: www.researchgate.net/publication/327070322_Remote_Warfare_Lessons_Learned_from_Contemporary_Theatres.

65 Baraa Shiban and Camilla Molyneux, “The Human Cost of Remote Warfare in Yemen”, in A. McKay, A. Watson and M. Karlshøj-Pedersen (eds), above note 64.

67 For more on persons with psychosocial disabilities in war-affected settings, see Hanna Kienzler, Suzan Mitwalli and Meryem Cicek, “The Experience of People with Psychosocial Disabilities of Living Independently and Being Included in the Community in War-Affected Settings: A Review of the Literature”, International Journal of Law and Psychiatry, Vol. 81, March–April 2022, available at: www.sciencedirect.com/science/article/pii/S0160252721000935?via%3Dihub. For more on women and girls with disabilities in crises and conflicts, see Brigitte Rohwerder, “Women and Girls with Disabilities in Conflict and Crises”, K4D Helpdesk Report, 16 January 2017, available at: https://gsdrc.org/wp-content/uploads/2017/10/032-Women-and-girls-with-disabilities-in-crisis-and-conflict.pdf. On deaf persons in armed conflict, see World Federation of the Deaf, “Guidelines for the Protection and Safety of Deaf People in Armed Conflicts”, available at: https://wfdeaf.org/news/resources/guidelines-for-the-protection-and-safety-of-deaf-people-in-armed-conflicts-2/.

68 Human Rights Watch, Stopping Killer Robots: Country Positions on Banning Fully Autonomous Weapons and Retaining Human Control, 2020, available at: www.hrw.org/sites/default/files/media_2021/04/arms0820_web_1.pdf.

69 Jolle Demmers, speaking on “Saferworld's Warpod, Episode 8: Remote Warfare: Interdisciplinary Perspectives”, Saferworld, available at: www.saferworld.org.uk/multimedia/saferworldas-warpod-episode-8-remote-warfare-interdisciplinary-perspectives.

70 Handicap International, Disability in Humanitarian Contexts: Views from Affected People and Field Organisations, Lyon, 2015, p. 4, available at: www.un.org/disabilities/documents/WHS/Disability-in-humanitarian-contexts-HI.pdf.

71 Alice Priddy, Disability and Armed Conflict, Academy Briefing No. 14, Geneva Academy, Geneva, April 2019, available at: www.geneva-academy.ch/joomlatools-files/docman-files/Academy%20Briefing%2014-interactif.pdf.

72 Bonnie Docherty, Mind the Gap: The Lack of Accountability for Killer Robots, Human Rights Watch, August 2015, available at: www.hrw.org/report/2015/04/09/mind-gap/lack-accountability-killer-robots.

73 A. Priddy, above note 71.

74 Siddharthya Roy and Richard Miniter, “Taliban Kill Squad Hunting Down Afghans – Using US Biometric Data”, New York Post, 27 August 2021, available at: https://nypost.com/2021/08/27/taliban-kill-squad-hunting-afghans-with-americas-biometric-data/; Eric Schmitt, “No U.S. Troops Will Be Punished for Deadly Kabul Strike, Pentagon Chief Decides”, New York Times, 13 December 2021, available at: www.nytimes.com/2021/12/13/us/politics/afghanistan-drone-strike.html.

75 María Antonia Sánchez Vallejo, “El Pentágono reconoce como un ‘trágico error’ el ataque con dron que mató a diez civiles durante la evacuación de Kabul”, El País, 17 September 2021, available at: https://elpais.com/internacional/2021-09-17/el-pentagono-reconoce-como-un-tragico-error-el-ataque-con-dron-que-mato-a-diez-civiles-durante-la-evacuacion-de-kabul.html.

76 “US Will not Punish Troops for Deadly Kabul Drone Attack”, Al Jazeera, 13 December 2021, available at: www.aljazeera.com/news/2021/12/13/us-will-not-punish-troops-over-deadly-kabul-drone-attack-reports.

77 “Condolence payments” was the term used by the US authorities. See Gibran Naiyyar Peshiman, “Afghan Family Decimated by US Drone Strike Awaits Justice from Washington”, Reuters, 10 November 2021, available at: www.reuters.com/world/asia-pacific/afghan-family-decimated-by-us-drone-strike-awaits-justice-washington-2021-11-10/.

78 G. N. Peshiman, above note 77.

79 For more on the technical characteristics of autonomous weapons systems, see Arthur Holland Michel, The Black Box, Unlocked: Predictability and Understandability in Military AI, United Nations Institute for Disarmament Research, 2020, available at: https://unidir.org/sites/default/files/2020-09/BlackBoxUnlocked.pdf; Anna Turek and Richard Moyes, Autonomy in Weapons: “Explicability” as a Way to Secure Accountability, Article 36, December 2020, available at: https://article36.org/wp-content/uploads/2020/12/Explicability-and-accountability.pdf.

80 V. Boulanin, L. Bruun and N. Goussac, above note 6, p. 40.

81 Brian Stauffer, “Can Algorithms Save Us from Human Error? Human Judgment and Responsibility in the Age of Technology”, Human Rights Watch, 2019, available at: www.hrw.org/world-report/2019/country-chapters/global-1.

82 B. Docherty, above note 72.

83 The text of the Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction is available at: www.apminebanconvention.org/en/the-convention/history-and-text/.

84 The text of the Convention on Cluster Munitions is available at: www.clusterconvention.org/convention-text/.

85 The text of the Treaty on the Prohibition of Nuclear Weapons is available at: https://treaties.un.org/doc/Treaties/2017/07/20170707%2003-42%20PM/Ch_XXVI_9.pdf.

86 On survivors’ participation in the processes that led to the ban on anti-personnel mines, see, for instance, Jerry White and Ken Rutherford, “The Role of the Landmine Survivors Network”, in Maxwell A. Cameron, Robert J. Lawson and Brian W. Tomlin (eds), To Walk without Fear: The Global Movement to Ban Landmines, Oxford University Press, Toronto, 1998.

87 John Borrie, Unacceptable Harm: A History of How the Treaty to Ban Cluster Munitions Was Won, United Nations Institute for Disarmament Research, 2009, p. xviii, available at: www.unidir.org/sites/default/files/publication/pdfs/unacceptable-harm-a-history-of-how-the-treaty-to-ban-cluster-munitions-was-won-en-258.pdf.

88 For more on victim assistance in the context of the Convention on Cluster Munitions, see Convention on Cluster Munitions, “Victim Assistance”, 2022, available at: www.clusterconvention.org/victim-assistance/; Markus Reiterer, “Assisting Cluster Munition Victims: A New International Standard”, Journal of ERW and Mine Action, Vol. 15, No. 1, 2011, available at: https://commons.lib.jmu.edu/cisr-journal/vol15/iss1/15.

89 Human Rights Watch, Meeting the Challenge: Protecting Civilians through the Convention on Cluster Munitions, November 2010, available at: www.hrw.org/sites/default/files/reports/armsclusters1110webwcover.pdf.

90 These organizations include the Afghan Landmine Survivors’ Organization, available at: http://afghanlandminesurvivors.org/en/; RED-LAT, available at: www.facebook.com/people/Red-Latinoamericana-de-Sobrevivientes-de-Map-Reg-y-Pcd/100066805080563/; the Fundación Red de Sobrevivientes y Personas con Discapacidad, El Salvador, available at: https://reddesobrevivientes.org; and the World Nuclear Survivors Forum 2021, available at: https://nuclearsurvivors.org. For survivor networks around the world, see: https://survivornetworks.wordpress.com/networks/.

91 Their activities included awareness-raising and advocacy on the impact of these weapons on men, women, girls and boys in affected communities, based on lived experience; advocacy activities at different levels to inform and encourage ministries of foreign affairs to negotiate a strong treaty text; promotion of national and regional conferences to discuss the human rights and economic impact of such weapons; awareness-raising among the general population regarding the negative impact of these weapons and the need for governments to negotiate prohibition treaties and ratify them; contributions to victim assistance and risk awareness in affected communities; and advocacy based on this field experience. In addition, during the Oslo Process, survivors’ contributions were fundamental to developing the article on victim assistance and the inclusion of victim assistance as a cross-cutting matter throughout the Convention on Cluster Munitions. This list is based on the lived experience of Jesús Martínez in the Ottawa and Oslo Processes, and the experience of Wanda Muñoz Jaime in the Oslo Process.

92 R. Acheson, above note 8.

93 Davide Castelvecchi, “AI Pioneer: ‘The Dangers of Abuse Are Very Real’”, Nature, 4 April 2019, available at: www.nature.com/articles/d41586-019-00505-2.

94 Automated Decision Research, “State Positions”, available at: https://automatedresearch.org/state-positions/?_state_position_negotiation=yes.

95 ICRC, “Autonomous Weapons: The ICRC Calls on States to Take Steps towards Treaty Negotiations”, 8 August 2022, available at: www.icrc.org/en/document/autonomous-weapons-icrc-calls-states-towards-treaty-negotiations.