
Drone Visuals and Compliance with IHL

Published online by Cambridge University Press:  24 March 2023

Shiri Krebs*
Affiliation:
Professor of Law, Deakin Law School; Chair, Lieber Society on the Law of Armed Conflict, American Society og International Law (ASIL); Co-Lead, Law and Policy Theme, Cyber Security Cooperative Research Centre (CSCRC); Affiliate Scholar, Stanford Center on International Security and Cooperation (CISAC). An earlier, pre-presentation draft of this contribution was published as Shiri Krebs, Through the Drone Looking Glass: Visualisation Technologies and Military Decision-Making, Articles of War (Feb. 11, 2022), at https://lieber.westpoint.edu/visualization-technologies-military-decision-making.

New Voices in International Law: Personalizing International Law in Times of Crisis
Copyright © The Author(s), 2023. Published by Cambridge University Press on behalf of The American Society of International Law

I. Introduction

On August 29, 2021, the U.S. military launched its last drone strike in Afghanistan before American troops withdrew from the country.Footnote 1 The strike targeted a white Toyota Corolla near Kabul's international airport, driven by Zemari Ahmadi and believed to be carrying an ISIS bomb. The strike destroyed the targeted vehicle and killed ten people. The U.S. military called it a “righteous strike,” explaining that it was necessary to prevent an imminent threat to American troops at Kabul's airport.Footnote 2 However, following the findings of a New York Times investigation,Footnote 3 a high-level U.S. Air Force investigation found that the targeted vehicle did not pose any danger and that all ten casualties were civilians, seven of them children. Despite these outcomes, the investigation concluded that the strike did not violate any law, because it was a “tragic mistake” resulting from “inaccurate” interpretation of the available intelligence.Footnote 4 The investigation suggested that the wrong—and lethal—interpretation of the intelligence—which included eight hours of drone visuals—resulted from “execution errors” combined with “confirmation bias.”

Using cognitive insights, such as confirmation bias, to explain—and excuse—military errors resulting in civilian casualties is a step forward, but not necessarily in the right direction. It is a step forward in the sense that it recognizes significant cognitive dynamics that limit crucial military risk assessment and fact-finding processes. But this step will not lead to better outcomes without a deeper understanding of how existing data practices—including real-time drone visuals—are susceptible to, and affected by, cognitive biases. Stronger and more effective protections for civilians in armed conflicts require acknowledging the core role drone visuals play in generating knowledge that is often perceived as objective—despite being distorted by technical, socio-technical, and cognitive dynamics.

In this presentation I aim to add these technological and behavioral elements in military knowledge production to the important discussions on compliance with international humanitarian law (IHL). Before delving into the substantive issues, it is important to clarify what I mean by “compliance.” Compliance is often invoked in the context of armed conflict as a technical-legalistic term which reflects a particular interpretation of the applicable legal rules. In this presentation, however, I explore compliance not within its very confined and context-dependent legalistic application, but rather as a humanistic term. Viewed through this lens, I invoke the term “compliance” to signify the core declared aim of this legal regime: to maintain respect for the law of armed conflict (LOAC) for the purpose of mitigating the harms of war and protecting humans, non-humans, and the environment during armed conflicts. This is especially important because when the concrete scope and interpretation of legal rules are deeply contested—as are most of the core IHL rules—compliance in its narrow technical-legalistic sense becomes almost meaningless. I will also use, interchangeably, the terms IHL and LOAC, signaling that this discussion transcends the existing interpretive “camps,” and that the law's protective goals ultimately contribute to human security everywhere.

II. Visualization Technologies and Compliance with IHL/LOAC

My research into the effects of visualization technologies on military decision making identifies several compliance-related challenges that stem from reliance on visualization technologies and can be explained—and improved—using behavioral insights. Visual technologies may influence the relevant legal standards, shaping the meaning of “reasonable commander” and constructing the scope of the legal burdens of care.Footnote 5

An awareness of the effects of cognitive biases on the interpretation of drone visuals may influence the scope of the duties to “do everything feasible” to verify the target identification and to avoid or minimize collateral damage.Footnote 6 For example, meaningful precaution may require mitigating systemic errors deriving from biased interpretation of drone visuals through various debiasing techniques. Additionally, the visible outputs of visualization technologies and the invisible biases involved in their interpretation may amplify pre-existing vulnerabilities in the legal standards and, in particular, in their murky standards of proof.Footnote 7

Addressing the debate surrounding the required level of certainty in targeting decisions, Tom Oakley points out that there is a knowledge gap concerning the legal requirement.Footnote 8 Indeed, in arguing against the “reasonable certainty” standard and in support of the “near certainty” one, Michael Adams and Ryan Goodman demonstrate that the level of certainty required by LOAC is anything but certain.Footnote 9 But even if the standard itself were clear, behavioral insights teach us that decisionmakers’ level of certainty may be unconsciously affected by a number of cognitive processes, leading to misinterpretation of the available evidence and to experts’ overconfidence in their biased analysis.Footnote 10

III. Limitations of Visualization Technologies

In the remainder of this presentation, I will focus on this last point, addressing the challenges relating to the effects of visualization technologies on military fact-finding processes, and exposing their technical, socio-technical, and cognitive constraints. I will do so using examples from military investigations in the United States and Israel, drawing attention to the invisible burdens these technologies place on decisionmakers. As findings from these investigations show, visualization outputs create an imperfect, yet highly persuasive, virtual representation of the actual conditions on the ground: a representation that is difficult, if not impossible, to refute.

To clarify, my claim is not that military decision-making processes are better or more accurate without the aid of visualization technologies. These technologies indeed provide a large amount of essential information about the battlefield, target identification, and the presence of civilians in the vicinity of a planned attack. I also do not engage here with arguments, such as those made by Samuel Moyn and others, that precision weapons and visualization technologies, combined with sophisticated war lawyering, humanize armed conflicts and legitimize violence.Footnote 11 The argument, instead, is that the undeniable benefits of visualization technologies for military decision-making processes mask their blind spots: visualization technologies are imperfect and limited in several ways, which are not always visible to decisionmakers.

First, visualization technologies have technical and human-technical limitations, including insufficient or corrupt data inputs, blind spots, as well as time and space constraints. The missing details or corrupt information remain invisible, while the visible (yet limited or partial) outputs capture decisionmakers’ attention. Indeed, emerging empirical evidence suggests that real-time imaging outputs may reduce the situational awareness of decisionmakers, who tend to place an inappropriately high level of trust in visual data.Footnote 12 Additionally, technology systems may fail or malfunction.

When military practices rely heavily on technology systems, decisionmakers’ own judgment, and their ability to evaluate evolving situations without the technology, erode. The misidentification of the Doctors Without Borders hospital in Kunduz, Afghanistan, in October 2015 as a legitimate target—a decision that led to the killing of forty-two patients and hospital staff members—was partly attributed to the AC-130 aircrew's reliance on infrared visualization technology.Footnote 13 Because this technology cannot display color, it could not depict the hospital's red cross symbol, which might have alerted the aircrew that the intended target was a medical facility. Ashley Deeks points out that both a positive target identification and an implicit approval (by not flagging the target as a protected one) may involve automation bias, whereby individuals accept the machine's explicit or implicit recommendation.Footnote 14

Second, these technical (and human-technical) limitations create gaps in the available data. The need to fill these gaps makes military decision making “rife with subjectivity and speculation,” as Tomer Broude puts it.Footnote 15 Anne van Aaken emphasizes the relevance of bounded rationality theories, including concrete biases such as availability, anchoring and confirmation, to the application and interpretation of international law generally, and in particular in the context of armed conflicts.Footnote 16

Availability bias occurs when people overstate the likelihood that a certain event will occur because it is easily recalled, making decisionmakers less sensitive to information that runs contrary to their expectations. This means that, under some circumstances—for example, in areas where insurgents have previously been identified—individuals depicted in drone visuals may be more likely to be interpreted as insurgents than as civilians. Anchoring bias occurs when the estimation of a condition is based on an initial value—an anchor—that might result from intuition, a guess, or other easily recalled information. The problem is that decisionmakers do not adjust sufficiently from this initial anchoring point. Confirmation bias refers to people's tendency to seek out and act upon information that confirms their existing beliefs, or to interpret information in a way that validates their prior knowledge. As a result, the interpretation of drone visuals may be skewed based on decisionmakers’ existing expectations, and this confirmation may then serve as an (inaccurate) anchor for casualty estimates or target identification.

To demonstrate the potential effects of these cognitive biases on military decisionmakers, let us return to the August 29 attack on the white Toyota Corolla that killed Zemari Ahmadi, three of his children, and six other family members and neighbors. The investigation concluded that U.S. forces received information about a planned terror attack involving a white Toyota Corolla at a specified location near Kabul's international airport. Once that information was received, visuals of Mr. Ahmadi, who was driving a white Toyota Corolla, were interpreted as consistent with this intelligence, and all of Mr. Ahmadi's subsequent movements and actions were interpreted to affirm this suspicion.

Similarly, subjective judgments—likely affected by availability bias—were found to be the cause of an erroneous Israel Defense Forces attack on civilians during Operation Cast Lead in January 2009.Footnote 17 On January 5, 2009, Israeli forces fired several projectiles at the Al-Samouni family house south of Gaza City, killing twenty-one civilians. The house was targeted following a drone visual that was misinterpreted as depicting five men holding RPG rockets at that location. An Israeli military investigation later found that the attack resulted from an erroneous reading of the drone visual, which in fact depicted the five men holding firewood. The technical limitations of the image left room for human judgment, which inserted subjectivity—and cognitive biases—into a seemingly objective visual.Footnote 18 My ongoing work in this space provides qualitative evidence from several additional investigations.Footnote 19

IV. Strengthening Compliance

Based on this analysis, strengthening compliance with IHL/LOAC's protective goal (as opposed to its contested standards) must include a new program focused on the behavioral elements of its technology-based knowledge production practices. In particular, it is essential to identify how drone visuals affect human risk assessments, adding tailored protections against these unconscious challenges. These may include reconceptualization of the “duty of care” (as suggested by Moshe Hirsch in another context);Footnote 20 heightened visibility of internal disagreements about the interpretation of drone visuals; a rigorous inter-agency review process, with the goal of offering alternative interpretations (similar to the idea of “red teams” in investigative journalism); training sessions that identify the concrete limits and blind spots of the technology (including relevant biases, such as automation bias); and a shift from individual to organizational accountability for technology-related failures.

This last point can lead to better compliance as it encourages individuals to identify their own errors without fear of retaliation. Of course, ex post investigations are themselves influenced by a number of cognitive biases, including outcome bias, as Tomer Broude and Inbar Levy demonstrate.Footnote 21 In my contribution to Andrea Bianchi and Moshe Hirsch's International Law's Invisible Frames book, I propose legal, epistemological, and behavioral ways to strengthen ex post military investigations, with a particular emphasis on ex post fact-finding processes.Footnote 22

While drone visuals hold much promise for evidence-driven risk assessments, visualization technologies may also jeopardize safety and security by masking data gaps and triggering unconscious cognitive biases. As governments around the world intensify their investments in sophisticated combat drones, it is essential to develop effective ways to better integrate these technologies into human decision-making processes, acknowledging the limitations of human cognition.

References

1 Matthieu Aikins, Times Investigation: In U.S. Drone Strike, Evidence Suggests No ISIS Bomb, N.Y. Times (Sept. 10, 2021), at https://www.nytimes.com/2021/09/10/world/asia/us-air-strike-drone-kabul-afghanistan-isis.html.

2 U.S. Dep't of Defense Press Release, Secretary of Defense Austin and Chairman of the Joint Chiefs of Staff Gen. Milley Press Briefing on the End of the U.S. War in Afghanistan (Sept. 1, 2021), at https://www.defense.gov/News/Transcripts/Transcript/Article/2762169/secretary-of-defense-austin-and-chairman-of-the-joint-chiefs-of-staff-gen-mille/.

3 Eric Schmitt, No U.S. Troops Will Be Punished for Deadly Kabul Strike, Pentagon Chief Decides, N.Y. Times (Dec. 13, 2021), at https://www.nytimes.com/2021/12/13/us/politics/afghanistan-drone-strike.html.

4 U.S. Dep't of Defense Press Release, Pentagon Press Secretary John F. Kirby and Air Force Lt. Gen. Sami D. Said Hold a Press Briefing (Nov. 3, 2021), at https://www.defense.gov/News/Transcripts/Transcript/Article/2832634/pentagon-press-secretary-john-f-kirby-and-air-force-lt-gen-sami-d-said-hold-a-p.

5 Michael N. Schmitt, Precision Attack and International Humanitarian Law, 87 Int'l Rev. Red Cross 445 (2005).

6 Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I), June 8, 1977, Art. 57.

7 Matthew C. Waxman, Detention as Targeting: Standards of Certainty and Detention of Suspected Terrorists, 108 Colum. L. Rev. 1365 (2008).

8 Tom Oakley, Closing the Gaps: Pre-deployment Role of the Military Legal Adviser, EJIL:Talk! (Nov. 23, 2021), at https://www.ejiltalk.org/closing-the-gaps-pre-deployment-role-of-the-military-legal-adviser.

9 Michael J. Adams & Ryan Goodman, “Reasonable Certainty” vs “Near Certainty” in Military Targeting–What the Law Requires, Just Security (Feb. 15, 2018) at https://www.justsecurity.org/52343/reasonable-certainty-vs-near-certainty-military-targeting-what-law-requires.

10 Paul Slovic, Baruch Fischhoff & Sarah Lichtenstein, Facts and Fears: Understanding Perceived Risk, in Societal Risk Assessment (Richard C. Schwing & Walter A. Albers eds., 1980).

11 Samuel Moyn, Humane: How the United States Abandoned Peace and Reinvented War (2022); Craig Jones, The War Lawyers (2020).

12 John McGuirl, Nadine B. Sarter & David D. Woods, Effects of Real-Time Imaging on Decision-Making in a Simulated Incident Command Task, 1 Int'l J. Info. Systems for Crisis Response & Mgmt. 54 (2009).

13 Matthew Rosenberg, Pentagon Details Chain of Errors in Strike on Afghan Hospital, N.Y. Times (Apr. 29, 2016), at https://www.nytimes.com/2016/04/30/world/asia/afghanistan-doctors-without-borders-hospital-strike.html.

14 Ashley S. Deeks, Predicting Enemies, 104 Va. L. Rev. 1529 (2018).

15 Tomer Broude, Behavioral International Law, 163 U. Pa. L. Rev. 1099 (2014).

16 Anne van Aaken, Behavioral International Law and Economics, 55 Harv. Int'l L.J. 421 (2014).

17 Shiri Krebs, Predictive Technologies and Opaque Epistemology in Counterterrorism Decision-making, in 9/11 and the Rise of Global Anti-terrorism Law: How the Security Council Rules the World (Arianna Vedaschi & Kim Lane Scheppele eds., 2021).

18 Id.

19 In another publication, I shed light on some of the biases that affected the decision-making process that led to the targeted killing of Salah Shehade by Israeli forces, an operation that resulted in the death of thirteen civilians, eight of whom were children (including Shehade's thirteen-year-old daughter). Shiri Krebs, The Invisible Frames Affecting Wartime Investigations: Legal Epistemology, Metaphors, and Cognitive Biases, in International Law's Invisible Frames (Andrea Bianchi & Moshe Hirsch eds., 2021).

20 Moshe Hirsch, Cognitive Sociology, Social Cognition and Coping with Racial Discrimination in International Law, 30 Eur. J. Int'l L. 1319 (2019).

21 Tomer Broude & Inbar Levy, Outcome Bias and Expertise in Investigations Under International Humanitarian Law, 30 Eur. J. Int'l L. 1303 (2019).

22 Krebs, supra note 19.