11.1 Introduction
Police use of facial recognition is on the rise across Europe and beyond.Footnote 1 Public authorities state that these powerful algorithmic systems could play a major role in assisting them to prevent terrorism, reduce crime, and more quickly locate and safeguard vulnerable persons (online and offline).Footnote 2 There is also an international consensus among policymakers, industry, academia, and civil society that these systems pose serious risks to the rule of law and to several human rights integral to the existence of a democratic society.Footnote 3 These include the rights to private life, freedom of expression, freedom of assembly and association, and equality, as guaranteed under the European Convention on Human Rights (ECHR).Footnote 4
In response to these ‘profound challenges’, policymakers and researchers have called for law reform that would provide greater clarity on the limits, lawfulness, and proportionality of facial recognition and other emerging AI-based biometric systems (such as gait and emotion recognition).Footnote 5 Consequently, some local and state governments in the United States have restricted or banned law enforcement use of facial recognition technologies.Footnote 6 During the pre-legislative stages of the proposed EU AI Act, the European Parliament also called for a ban on the use of private facial recognition databases in law enforcement.Footnote 7 The world’s first case examining the legality of a facial recognition system deployed by police, Bridges v. South Wales Police, thus remains an important precedent for policymakers, courts, and scholars worldwide.Footnote 8
This chapter focusses on the role and influence of the right to private life, as enshrined in Article 8 ECHR and the relevant case law of the European Court of Human Rights (ECtHR), in the ‘lawfulness’ assessment of the police use of live facial recognition (LFR) in Bridges, a framework that the Court of Appeal for England and Wales ultimately held was ‘not in accordance with the law’ for the purposes of Article 8(2) and therefore in breach of Article 8 ECHR.Footnote 9 The analysis also considers the emerging policy discourse prompted by Bridges in the United Kingdom (UK) surrounding the need for new legislation.Footnote 10 This marks a significant shift away from the current AI governance approach of combining new ethical standards with existing law.Footnote 11
11.2 Facial Recognition Systems in Law Enforcement: Legal and Technical Issues
Within a legal context, the use by public authorities of any preventive measure that indiscriminately captures and analyses the biometric data of a vast number of innocent individuals raises significant questions. These include whether rule of law requirements developed to ensure adequate limits, safeguards, and oversight for police surveillance systems in the pre-Internet era, such as closed-circuit television (CCTV), remain adequate and relevant to facial recognition systems and other modern internet-enabled, automated monitoring systems; and whether such novel and powerful AI-based technologies are strictly necessary in a democratic society and respect the presumption of innocence.Footnote 12 Privacy and data protection concerns have also been raised regarding the transparency and oversight challenges posed by the increasing role of the private sector within the areas of law enforcement and public security. These developments range from law enforcement use of data-driven tools and systems developed by industry, including commercial facial recognition software,Footnote 13 to tasking industry itself with law enforcement functions.Footnote 14
11.2.1 Facial Recognition Systems in Law Enforcement: Issues of Accountability and Bias
The use of AI-based biometric systems for law enforcement purposes raises several legal and technical issues. First, there are transparency and accountability challenges that may hinder adequate independent auditing and oversight of their overall efficiency and societal impacts. These stem from the opaque design and operation of commercial facial recognition systems, including intellectual property issues, what training datasets are used, the risk of those datasets being unfairly biased, and how exactly these automated decisions and recommendations are being made (and fairly assessed) by public authorities.Footnote 15 Secondly, there is scientific evidence that facial recognition software currently designed and developed by industry, and subsequently used for law enforcement, is biased, with a greater risk of false identifications (‘false positives’) for women and people from black, Asian, and other minority ethnic backgrounds.Footnote 16 There is then a risk that such groups may be disproportionately affected by this technology. This is particularly problematic given the need for public trust that police powers are being used lawfully and responsibly, and the existing evidence of racial bias across the UK justice system (and indeed in other jurisdictions), with resulting harms including false arrests and over-policing of already vulnerable communities.Footnote 17
11.2.2 Facial Recognition Systems in Law Enforcement: ‘Real-Time’ and Historical
Police trials of industry-developed facial recognition systems have been taking place in the UK since 2014.Footnote 18 Automated facial recognition (AFR) means that a machine-based system performs the recognition, either for the entire process or with the assistance of a human being. Live automated one-to-many matching involves comparing near real-time video images of individuals against a curated watchlist of facial images. In a law enforcement context, this is typically used to assist the recognition of persons of interest on a watchlist, meaning that police are required to verify or override a possible match identified by the system (a system alert) and decide what actions to take (if any).Footnote 19 However, as regulators and scholars highlight, much uncertainty in UK law (and in laws across Europe) surrounds the complex legal framework governing police use of real-time (live) and historical (retrospective/post-event) facial recognition and other biometric identification systems.Footnote 20
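To make this one-to-many matching process concrete, the sketch below models it in Python. It is a minimal illustration only, assuming a generic embedding model, cosine similarity scoring, and an operator-set alert threshold; the function names (embed_face, screen_frame), the threshold value, and the scoring method are all assumptions for exposition, since deployed systems such as AFR Locate rely on proprietary vendor algorithms.

```python
# Minimal sketch of live one-to-many facial matching (illustrative only).
import numpy as np

ALERT_THRESHOLD = 0.9  # operator-set threshold value (assumed for this sketch)

def embed_face(face_image: np.ndarray) -> np.ndarray:
    # Stand-in for a proprietary face-embedding model: real systems map a
    # face crop to a biometric template; here we merely normalise the pixels.
    v = face_image.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def screen_frame(frame_faces: list[np.ndarray],
                 watchlist: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    """Compare each face detected in a video frame against every watchlist
    template (one-to-many matching) and return alerts above the threshold."""
    alerts = []
    for face in frame_faces:
        probe = embed_face(face)
        for identity, template in watchlist.items():
            score = float(probe @ template)  # cosine similarity of unit vectors
            if score >= ALERT_THRESHOLD:
                # A system alert is only a *possible* match: a human officer
                # must verify or override it before deciding on any action.
                alerts.append((identity, score))
    return alerts

# Example: two watchlist templates and one face detected in the current frame.
# A non-matching face will typically score below the threshold, so no alert.
watchlist = {"person_A": embed_face(np.random.rand(32, 32)),
             "person_B": embed_face(np.random.rand(32, 32))}
print(screen_frame([np.random.rand(32, 32)], watchlist))
```

The sketch also makes plain the division of labour noted above: the system only generates candidate matches at or above a threshold, and it is the reviewing officer who verifies or overrides each alert.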
In the case of historical (post-event) facial recognition systems, an individual’s facial data are compared and identified by public authorities, after the event, against images previously collected through various sources. These include custody photographs and video footage from CCTV, body-worn police cameras, or other private devices. Both the ECtHR and the Court of Justice of the EU (CJEU) view real-time access to these automated biometric systems, as opposed to searching through previously collected facial images, as inherently more invasive.Footnote 21 Yet these judgments do not explain why tracking a person’s movements or attendance at certain events (such as public protests) over months or years should be viewed as less invasive of their privacy than one instance of real-time identification, particularly given the capacity of these automated systems to identify thousands of individuals using facial images in only a few hours.Footnote 22
Such legal uncertainty would be less likely if these issues had already been addressed in a clear legislative framework regulating law enforcement use of facial recognition systems. As the UK House of Lords rightly points out, while they play ‘an essential role in addressing breaches of the law, we cannot expect the Courts to set the framework for the deployment of new technologies’.Footnote 23 In other words, it is not the function of courts to provide a detailed and comprehensive legal framework for police powers, though they may provide careful scrutiny of the current law and its application in specific circumstances. This brings us to Article 8 ECHR and its relevance to the landmark case of Bridges where police use of LFR was (ultimately) held not to have met the legality requirements of Article 8(2).
11.3 Justifying an Interference with Article 8 ECHR
11.3.1 Police Collection of Biometric Data: An Interference with Article 8(1)
The ECtHR has described the negative obligation to protect against arbitrary interference by a public authority with a person’s private life as ‘the essential object’ of Article 8 ECHR.Footnote 24 It is also well-established case law that the mere storage of data ‘relating to the private life of an individual’ for the prevention of crime constitutes an interference with the right to respect for private life.Footnote 25 The ECtHR Grand Chamber has further held that it is irrelevant whether information collected by interception or other secret measures has subsequently been accessed, used, or disclosed.Footnote 26 Public information has also been held to fall within the scope of private life when it is systematically collected and stored by public authorities.Footnote 27
In determining whether the retention of this personal data involves any ‘private-life’ aspects, the ECtHR will have due regard to the specific context in which the information has been recorded and retained, the nature of the records, the way in which these records are used and processed, and the results that may be obtained.Footnote 28 These standards all derive from the long-established principle in ECHR case law that ‘private life’ is ‘a broad term not susceptible to exhaustive definition’.Footnote 29 As a result, this concept has been interpreted broadly by the Strasbourg Court in cases involving Article 8 ECHR and any data collection, retention, or use by public authorities in a law enforcement context. Even if no physical intrusion into a private place occurs, surveillance can still interfere with physical and psychological integrity and the right to respect for private life. For instance, in Zakharov v. Russia, the ECtHR Grand Chamber held that Russian laws providing security agencies and police with remote direct access to the databases of mobile phone providers in order to track users contained several ‘defects’, owing to a lack of adequate safeguards against abuse, thereby constituting a breach of Article 8 ECHR.Footnote 30
The ECtHR has also shown itself to be particularly sensitive to the ‘automated processing’ of personal data and the unique level of intrusiveness on the right to private life posed by the retention and analysis of biometric data for law enforcement purposes, particularly DNA.Footnote 31 Biometric data (DNA, fingerprints, facial images) are a highly sensitive source of personal data because they uniquely identify an individual and may also be used to reveal other sensitive information about that individual, their relatives, or related communities, including their health or ethnicity. Consequently, the ECtHR has held that even the capacity of DNA profiles to provide a means of ‘identifying genetic relationships between individuals’ for policing purposes amounts to a privacy interference of a ‘highly sensitive nature’ requiring ‘very strict controls’.Footnote 32
At the time of writing, there has been no judgment in which the ECtHR has been required to specifically review the compatibility of police use of an LFR system with Article 8 ECHR. This is surely, however, an important question on the horizon for the Strasbourg Court, particularly as the technology has already featured in the legal analysis of related case law. In Gaughran v. United Kingdom, in determining that the taking and retention of a custody photograph on the applicant’s arrest clearly amounted to an interference with Article 8(1), the ECtHR highlighted as a factor the possibility that the police ‘may also apply facial recognition and facial mapping techniques’ to that image.Footnote 33 Current jurisprudence therefore leaves little doubt that the collection, retention, or analysis of an individual’s facial image for the prevention of crime (irrespective of where or how it was acquired) amounts to an interference with the right to private life, as guaranteed under Article 8 ECHR.
11.3.2 The Legality Requirements under Article 8(2): The Traditional Approach
Under the traditional approach of the ECtHR in its assessment of whether an interference with Article 8(1) is justified, there is a two-stage test. First, as noted earlier, the ECtHR assesses whether the complaint falls within the scope of Article 8(1) and whether the alleged interference by the contracting state (such as the UK) has engaged Article 8(1). If so, the ECtHR will then examine whether the interference with one of the protected interests in Article 8(1) (in this instance, ‘private life’) meets the conditions of Article 8(2). The three conditions examined during this second stage concern whether the interference is ‘in accordance with the law’ (legality), pursues one of the broadly framed legitimate aims under Article 8(2) (including the prevention of crime), and whether it is ‘necessary in a democratic society’ (proportionality). If a measure is determined not to have satisfied the requirements of the legality condition, the ECtHR will not proceed to examine the proportionality condition.Footnote 34
The traditional approach of the ECtHR, when determining if an interference meets the legality condition under Article 8(2), requires that the contested measure satisfy two principles. The measure must have ‘some basis in domestic law’ and, secondly, must also comply with the rule of law.Footnote 35 In its early jurisprudence, the ECtHR established that the principle of having some basis in domestic law comprises legislation and judgments.Footnote 36 The second principle focusses on the ‘quality’ of the domestic law, which involves meeting the tests of ‘accessibility’ and ‘foreseeability’.Footnote 37 As police operation and use of surveillance measures by their very nature are not open to full scrutiny by those affected or the wider public, the ECtHR has stated that it would be ‘contrary to the rule of law for the legal discretion granted to the executive or to a judge to be expressed in terms of an unfettered power’.Footnote 38
Thus, as part of the Article 8(2) foreseeability test, the ECtHR developed six ‘minimum’ safeguards that the basis in domestic law should address to avoid abuses of power in the use of secret surveillance. These comprise: the nature of the offences in relation to which the measure may be applied; a definition of the categories of people that may be subjected to the measure; a limit on the duration of the measure; the procedures to be followed for the examination, use, and storage of the obtained data; the precautions to be taken if data are shared with other parties; and the circumstances in which obtained data should be erased or destroyed.Footnote 39 With regard to police use of emerging technologies, the ECtHR has consistently held that such measures ‘must be based on a law that is particularly precise … especially as the technology available for use is continually becoming more sophisticated’.Footnote 40 The ECtHR has further stressed, in cases where biometrics have been retained for policing purposes, that the need for data protection safeguards is ‘all the greater’ where ‘automatic processing’ is concerned.Footnote 41
This traditional approach, and the resulting legality standards developed and applied in landmark Article 8 ECHR judgments, have shaped and brought about notable legal reforms in domestic laws governing data retention and secret surveillance by public authorities across Europe.Footnote 42 Scholars have long recognised this impact, highlighting the major role played by this Article 8 ECHR jurisprudence in entrenching and ratcheting up data privacy standards in EU countries and within the legal system of the EU.Footnote 43 These minimum legality requirements seem no less than essential to ensuring adequate accountability and oversight of police surveillance powers. Indeed, as the ECtHR points out, this is an area ‘where abuse is potentially so easy in individual cases and could have such harmful consequences for democratic society as a whole’.Footnote 44 However, more recent case law dealing with Article 8 ECHR and the legality of police powers has diverged from this lauded approach.
11.3.3 The Legality Requirements under Article 8(2): The à la carte Approach
Two key developments in its jurisprudence have contributed to the departure of the ECtHR from its previously lauded role in setting minimum standards for the review of laws governing government surveillance and police investigatory powers across Europe.
11.3.3.1 The Hierarchy of Intrusiveness
First, the ECtHR has established that the scope of the safeguards required to meet the legality requirements under Article 8(2) will depend on the nature and extent of the interference with the right to private life.Footnote 45 This means that the ECtHR will not apply the same strict-scrutiny approach to interferences it considers less intrusive, and thus less serious in their effect on an individual’s rights under Article 8(1).Footnote 46 Accordingly, the ECtHR may assess a measure to be a justified interference with Article 8 ECHR even if the domestic legal basis does not incorporate the six minimum foreseeability safeguards.Footnote 47 Application of this ‘hierarchy of intrusiveness’ principle is clearly evident in the general legality assessments of the High Court and Court of Appeal in Bridges, discussed in Section 11.4.
11.3.3.2 The Joint Analysis of Legality and Proportionality
Secondly, and perhaps more importantly, scholars have raised concerns regarding a shift away from the traditional approach of the ECtHR in its Article 8 ECHR case law dealing with data retention and state surveillance. This often takes the form of an assessment that combines the legality and proportionality conditions under Article 8(2), conflating the separate principles and requirements of these distinct conditions.Footnote 48 From a rule of law perspective, this shift is highly problematic as it makes what is already a case-by-case analysis by the ECtHR less systematic and clear. The resulting assessment of the domestic law is often ad hoc, patchy, and invariably less detailed regarding what specific standards contracting states should satisfy if a contested measure is to be considered compatible with Article 8 ECHR.
Thus, as Murphy rightly notes, this joint analysis has resulted in the ECtHR applying less scrutiny of the accessibility and foreseeability legality tests, thereby serving to weaken the substantive protection of the right to respect for private life provided under Article 8 ECHR.Footnote 49 Indeed, the ECtHR may also determine (without any detailed reasoning) that no rule of law assessment at all be undertaken and that the Article 8(2) stage assessment proceed directly to an examination of the proportionality condition. Catt v. United Kingdom illustrates the application of this à la carte approach to the requirements of Article 8(2), where the legality condition assessment is entirely omitted despite being the core issue before the ECtHR.
11.3.3.3 Catt v. United Kingdom: The Danger of Ambiguous Common Law Police Powers
The main facts in Catt involve the overt collection and subsequent retention of more than sixty records (including a photograph) on an ‘Extremism database’ concerning the applicant’s attendance at protests between 2005 and 2009. The applicant was never charged or accused of any violent conduct as part of these protests.Footnote 50 An instrumental factor in Catt and Bridges is the broad scope of the ‘common law’ in England and Wales, which allowed for the police collection and storage of information in both cases.Footnote 51 Based on the undefined scope of these police powers, and the lack of clarity regarding what fell within the concept of ‘domestic extremism’, the ECtHR in Catt states that there was ‘significant ambiguity over the criteria being used by the police to govern the collection of the data in question’.Footnote 52 A year later, in Bridges, the Court of Appeal would also criticise the same lack of clarity surrounding the criteria and limits underpinning the use of LFR by South Wales Police (SWP).
The Article 8(2) assessment in Catt then takes a curious turn. Following a bald statement that the question of whether the collection, retention, and use of the applicant’s personal data were in accordance with the law is ‘closely related to the broader issue of whether the interference was necessary in a democratic society’, the ECtHR observes that it is not necessary for the legality condition to be examined.Footnote 53 The ECtHR then proceeds to hold that the retention of the applicant’s personal data on this police database, based as it was on no ‘particular inquiry’, constituted a disproportionate interference with Article 8 ECHR.Footnote 54 The ECtHR was particularly critical that the applicant’s personal data in Catt could potentially have been retained indefinitely owing to ‘the absence of any rules setting a definitive maximum time limit on the retention of such data’.Footnote 55 The ECtHR further observes that the applicant was ‘entirely reliant’ on the application of ‘highly flexible safeguards’ in non-legally binding guidance to ensure the proportionate retention of his data.Footnote 56 In other words, as Woods rightly points out, this is ‘hardly a ringing endorsement of broad common law powers’.Footnote 57
However, despite its recognition of the ‘danger’ posed by the ambiguous approach to the scope of data collection under common law police powers,Footnote 58 the ECtHR sidesteps dealing with the lack of any clear legal basis and undertakes no assessment of the six minimum foreseeability safeguards. By departing from the traditional approach in its assessment of Article 8 ECHR, the ECtHR stops short of any detailed scrutiny of these requirements under the legality condition of Article 8(2). This allows the ECtHR to avoid addressing whether the ‘common law’ basis for police collection of personal data in the UK provides the ‘minimum degree of legal protection’ to which citizens are entitled under the rule of law in a democratic society.Footnote 59 Indeed, the curious decision of the ECtHR not to deal with these clear legality issues, and the resulting lax approach, are subject to strong criticism from members of the Strasbourg Court itself in Catt.Footnote 60 The latter stressed that the unresolved ‘quality of law’ questions posed by the contested common law police powers are actually ‘where the crux of the case lies’.Footnote 61 This à la carte approach to the rule of law requirements in Catt is also clearly evident in the assessment of the LFR system by the national courts in Bridges, examined in Section 11.4.
11.4 Bridges v. South Wales Police: The ‘Lawfulness’ of AFR Locate
11.4.1 Background and Claimant’s Arguments
This landmark case involves two rulings, the most significant being the Court of Appeal judgment delivered in 2020.Footnote 62 The claimant/appellant was Edward Bridges, a civil liberties campaigner who lived in Cardiff. His claim was supported by Liberty, an independent civil liberties organisation. The defendant was the Chief Constable of SWP. SWP is the national lead on the use of AFR in policing in the UK and has been conducting trials of the technology since 2017.Footnote 63 The software used by SWP for LFR in public places was developed by NEC (now North Gate Public Services (UK) Ltd).Footnote 64 In Bridges, AFR Locate was deployed by SWP via a live feed from CCTV cameras to match facial images and biometrics captured in the feed against watchlists compiled from existing custody photographs. The software would alert SWP to a possible match (subject to a threshold level set by SWP); officers would then verify the match and, if it was confirmed, determine whether any further action was required, such as making an arrest.Footnote 65
Mr Bridges challenged the lawfulness of SWP’s use of the AFR Locate system in general, and made a specific complaint regarding two occasions when his image (he argued) was captured by the system. The first occasion was in a busy shopping area in December 2017, the second at a protest attended by the claimant in March 2018.Footnote 66 Regarding the legality requirements of Article 8 ECHR and use of this LFR system by SWP, the claimant submitted two main arguments. First, there is ‘no legal basis’ for the use of AFR Locate and thus SWP did not, as a matter of law, have power to deploy it (or any other use of AFR technology). Secondly, even if it was determined that some domestic basis in law existed, it was not ‘sufficient’ to be capable of constituting a justified interference under Article 8(2).Footnote 67 This contrasts with legal provisions under the Police and Criminal Evidence Act 1984 and its related Code of Practice, which specifically state the circumstances that apply to police collection and use of DNA and fingerprints.Footnote 68
The claimant submitted that to satisfy the legality condition of Article 8(2) there must be a legal framework that specifies the following five safeguards. First, the law should specify the circumstances and limits by which AFR Locate may be deployed, such as only when there is ‘reasonable suspicion’ or a ‘real possibility’ that persons who are sought may be in the location where AFR Locate is deployed. Secondly, the law should place limits on where AFR Locate may be deployed. Thirdly, the law should specify the ‘classes of people’ that may be placed on a watchlist, further arguing that this be limited to ‘serious criminals at large’. Fourthly, the law should state the sources from where images included in watchlists may be obtained. Finally, the law should provide ‘clear rules relating to biometric data obtained through use of AFR Locate’. This should include how long it may be retained and the purposes for which such information may (or may not) be used.Footnote 69
The claimant thus challenged the absence of any accessible or foreseeable legal framework (in legislation or any related Code of Practice) that explicitly and clearly regulates the obtaining and use of AFR technology by SWP (or any police force) in England and Wales. In her role as an intervener before the High Court in Bridges, the then Information Commissioner (the statutory regulator of UK data protection law) made similar arguments. While she did not seek to limit the categories of persons who might be included on watchlists, her submission was that the ‘categories of who could be included on a watchlist needed to be specified by law’. She also submitted that the purposes for which AFR Locate could be used should be specified in law. Finally, she argued that any use of AFR Locate, and any decision as to who should be included on a watchlist, needed to be the subject of ‘independent authorisation’.Footnote 70
11.4.2 Police Collection, Use, Retention of a Facial Image: An Interference with Article 8(1)
Both the High Court and the Court of Appeal engage in detail, and at length, with the Article 8 ECHR case law of the ECtHR in their assessments that SWP use of AFR Locate amounted to an interference with the Article 8(1) rights of the applicant. As the High Court states: ‘Like fingerprints and DNA, AFR technology enables the extraction of unique information and identifiers about an individual allowing his or her identification with precision in a wide range of circumstances. Taken alone or together with other recorded metadata, AFR-derived biometric data is an important source of personal information.’Footnote 71 This determination is unsurprising for two reasons.
First, as noted earlier, the ECtHR has consistently held that the collection and use of biometric data using automated processing for police purposes constitutes an interference with Article 8(1). Secondly (and perhaps more importantly), none of the parties contested that use of the AFR Locate system by SWP constitutes an interference with Article 8(1).Footnote 72 Nevertheless, as the first judgment worldwide to hold that a police force’s use of LFR constituted an interference with Article 8 ECHR, this assessment in Bridges represents an important legal precedent in European human rights law and international human rights law.
11.4.3 Was SWP Deployment of AFR Locate ‘In Accordance with the Law’ under Article 8(2)?
11.4.3.1 High Court Finds Common Law Powers ‘Amply Sufficient’: No Breach of Article 8 ECHR
With respect to there being no specific statutory legal basis for SWP’s use of LFR, SWP and the Secretary of State submitted to the High Court that the police’s common law powers constituted ‘sufficient authority for use of this equipment’.Footnote 73 The High Court accepted this argument. In its reasoning, the High Court cited at length previous case law in which the extent of the police’s common law powers has generally been expressed in very broad terms. In particular, the High Court relied heavily on the controversial majority judgment of the UK Supreme Court in Catt.Footnote 74 The High Court gave considerable weight to a specific passage by Lord Sumption JSC, who states in Catt that at ‘common law the police have the power to obtain and store information for policing purposes … [provided such] powers do not authorise intrusive methods of obtaining information, such as entry onto private property or acts … which would constitute an assault’.Footnote 75
The High Court then observed that the ‘only issue’ for it to consider was whether using CCTV cameras fitted with AFR technology to obtain the biometric data of members of the public in public amounts to an ‘intrusive method’ of obtaining information as described by Lord Sumption JSC in Catt. Observing that the AFR Locate system’s method of obtaining an image ‘is no more intrusive than the use of CCTV in the streets’, the High Court held that such data collection did not fall outside the scope of the powers available to the police at common law.Footnote 76 Regarding the use of watchlists within the AFR Locate system, the High Court swiftly concluded that as the relevant images were acquired by way of police photography of arrested persons in custody, the police already have explicit statutory powers to acquire, retain, and use such imagery under the Police and Criminal Evidence Act 1984.Footnote 77 The High Court also took no issue with the ambiguity of the broadly framed scope for watchlists, which may cover any ‘persons of interest’ to the police, the grounds for this reasoning being that the compilation of any watchlist ‘is well within the common law powers of the police … namely “all steps necessary for keeping the peace, for preventing crime or for protecting property”’.Footnote 78
The High Court briefly refers to the general requirements of accessibility and foreseeability, but there is no mention of (or any engagement with) the six minimum safeguards implicitly raised in the claimant’s submission on legality. Instead, the court distinguishes the need for AFR Locate to have ‘detailed rules’ or any independent oversight governing the scope and application of police retention and use of biometrics (as set out in the ECtHR jurisprudence) on two grounds: first, that facial recognition is ‘qualitatively different’ from the police retention of DNA, which provides access to a very wide range of information about a person; and secondly, that it is not a form of covert (or secret) surveillance akin to communications interception.Footnote 79 In addition to the common law, the High Court stresses that the legal framework comprises three layers, namely existing primary legislation, codes of practice, and SWP’s own local policies, which it considered to be ‘sufficiently foreseeable and accessible’.Footnote 80
In dismissing the claimant’s judicial review on all grounds, the High Court held the legal regime was adequate ‘to ensure the appropriate and non-arbitrary use of AFR Locate’, and that SWP’s use to date of AFR Locate satisfied the requirements of the UK Human Rights Act 1998 and data protection legislation.Footnote 81
11.4.3.2 ‘Fundamental Deficiencies’ in the Law: Court of Appeal Holds Breach of Article 8 ECHR
In stark contrast to the High Court judgment, the Court of Appeal held the use of the AFR Locate system by SWP to have breached the right to respect for private life, as protected under Article 8 ECHR via the UK Human Rights Act 1998, because of ‘two critical defects’ in the legal framework that leave too much discretion to individual officers.Footnote 82 The Court of Appeal highlights that the (non-legally binding) guidance in the Surveillance Camera Code of Practice 2013 did not contain any requirements as to the content of local police policies on who can be put on a watchlist. Nor did it contain any guidance as to what local policies should contain ‘as to where AFR can be deployed’.Footnote 83
The Court of Appeal further criticised the fact that SWP’s local policies did ‘not govern who could be put on a watchlist in the first place … [and] leave the question of the location simply to the discretion of individual police officers’.Footnote 84 Thus, the Court of Appeal took issue with ‘fundamental deficiencies’ in the legal framework relating to two areas of concern, namely two of the six minimum foreseeability safeguards established in the ECtHR’s Article 8 ECHR case law: ‘The first is what was called the “who question” at the hearing before us. The second is the “where question” … In relation to both of those questions too much discretion is currently left to individual police officers.’Footnote 85
11.4.4 Beyond Bridges: Moves towards Regulating Police Use of Facial Recognition?
The Court of Appeal judgment represents a clear departure from the legality assessment of the High Court, particularly in its determination that SWP’s use of LFR does not satisfy the requirement of Article 8 ECHR (via the UK Human Rights Act 1998) of being ‘in accordance with the law’. This assessment was long awaited by civil society and scholars, who had consistently raised concerns that police trials of LFR in England and Wales risked being held unlawful if challenged before the courts. Two key issues were the lack of a specific legal basis authorising police use of AFR and a lack of clarity regarding the foreseeability of the applicable circumstances and safeguards by which police services across England and Wales are lawfully permitted to use these automated systems.Footnote 86 Indeed, the former Biometrics Commissioner observed in his 2017 Annual Report that the development and deployment of automated biometric systems in use by police at that time was already ‘running ahead of legislation’.Footnote 87
The Court of Appeal judgment in Bridges thus provides some clarity regarding the ‘deficiencies’ to be addressed by the current legal framework applied specifically by SWP and its deployment of a specific LFR system. Critically, however, the Court of Appeal also states that Bridges is ‘not concerned with possible use of AFR in the future on a national basis’, only the local deployment of AFR within the area of SWP.Footnote 88 Thus, the legality of police use of facial recognition systems (real-time and post-event) across the UK remains a subject of intense debate. Over 80,000 people have signed a petition (organised by UK-based non-governmental organisation Liberty) calling on the UK Government to ban all use of facial recognition in public spaces.Footnote 89 In 2022, a House of Lords report and a review on the governance of biometrics in England and Wales (commissioned by the Ada Lovelace Institute) both called for legislation that would provide greater clarity on the use, limits, and safeguards governing facial recognition and other AI-based biometric systems.Footnote 90
In the wake of the Bridges case, the Biometrics and Surveillance Camera Commissioner (BSCC) for England and Wales and civil society have also highlighted concerns regarding wide-ranging guidance from the College of Policing,Footnote 91 which gives police services considerable discretion regarding the criteria for persons who may be placed on a watchlist for LFR use. The Commissioner has noted that the broad and general scope of such guidance means LFR is not limited to the identification of suspects but may even include potential victims on such watchlists, providing police with a level of discretion that has ‘profound’ implications for constitutional freedoms.Footnote 92 A Data Protection Impact Assessment (DPIA) published by SWP concerning their use of LFR confirms the broad criteria for those persons that may be placed on a watchlist, including witnesses and persons ‘who are or may be victims of a criminal offence’.Footnote 93
It is also important to stress that the standards set out in the College of Policing’s Authorised Professional Practice (APP) are not legally binding, nor do they constitute a statutory code of practice. In direct reference to its legal context, the APP states that its function is to provide ‘direction to [police] forces that will enable them to ensure that their deployment of overt LFR [complies] with applicable legal requirements’.Footnote 94 An important caveat for police services across England and Wales, however, immediately follows. The APP implicitly acknowledges that such guidance is insufficient in and of itself to ensure the lawfulness of LFR, and proceeds to specifically advise police that they should obtain ‘expert legal advice’ to support their use of these systems.Footnote 95
In terms of developing the foreseeability safeguards as part of the legality requirements of Article 8(2) ECHR, it is submitted that legislation should require law enforcement authorities using facial recognition systems to make publicly available the ‘threshold value’ they apply when using these systems. Where a system has been acquired from the private sector, this information should also explain if (and why) public authorities have chosen to depart from the default threshold value set by the company providing it. Independent scientific research examining the facial recognition systems being used by SWP and the Metropolitan Police Service,Footnote 96 has specifically stated that false positive identifications ‘increase at lower face-match thresholds and start to show a statistically significant imbalance between demographics with more Black subjects having a false positive than Asian or White subjects’.Footnote 97 Thus, using a system with a lower threshold value increases the number of matching results but also increases the risk of unfair bias against certain societal groups by law enforcement, and should consequently be accompanied by the necessary justification and safeguards. Such information should also be shared in DPIAs in order to alert regulators (and other independent oversight bodies) to the increased risk of bias posed towards certain groups and to the safeguards being adopted by public authorities to address and mitigate these risks.
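To make the threshold trade-off concrete, the toy simulation below uses synthetic score distributions; the group labels, score means, and threshold values are illustrative assumptions, not figures drawn from the research cited above. It shows how lowering a face-match threshold increases false positive rates overall, and can do so unevenly across demographic groups.

```python
# Toy simulation of the threshold trade-off (synthetic data, illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical similarity scores for non-matching (innocent) faces; group_B is
# assumed, purely for illustration, to receive slightly higher scores,
# mirroring the kind of demographic imbalance reported in that research.
scores = {
    "group_A": rng.normal(0.30, 0.10, 100_000),
    "group_B": rng.normal(0.34, 0.10, 100_000),
}

# Lowering the threshold yields more candidate matches but a higher false
# positive rate, and the absolute gap between the two groups widens.
for threshold in (0.65, 0.55, 0.45):
    rates = {g: round(float((s >= threshold).mean()), 4)
             for g, s in scores.items()}
    print(f"threshold={threshold:.2f}  false positive rates: {rates}")
```

On this synthetic data, dropping the threshold from 0.65 to 0.45 multiplies the false positive rate for both groups while widening the absolute gap between them, which is precisely why disclosure of the threshold value in use (and of any departure from vendor defaults) matters for independent oversight.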
11.5 Conclusions
While the Court of Appeal in Bridges has provided some valuable guidance, drawing (albeit in a limited way) on the lauded legality case law of the ECtHR dealing with Article 8 ECHR and police investigatory powers, the current patchwork of law governing police use of facial recognition in the UK falls short of what lawful and trustworthy public policy requires.
As the UK House of Lords rightly points out, it is not for the courts to set the framework for the deployment of new technologies. This chapter argues that the reasoning for this is threefold. First, court judgments are not systematic, comprehensive, or evidence-based. Secondly, they represent ad hoc reviews of problematic public policymaking that occur only when (and if) a legal challenge is brought. Thirdly, the courts will assess only the narrow scope of issues relevant to the specific case before them. The implications posed by the lack of an accessible and foreseeable framework for police use of AFR in the UK are significant. This gap is a source of confusion and legal uncertainty for policymakers, police, industry, courts, and citizens, resulting in patchy protection of affected rights and safeguards, including but not limited to the right to private life. These deficiencies all serve to undermine adequate and effective compliance, oversight, and evaluation, and thus public trust in the use of these novel and increasingly sophisticated police powers.
There is, however, a post-Bridges discourse on lawfulness that has moved towards enacting a law specifically tailored to regulating the use of facial recognition. Such reform could address the obscurity and uncertainty in the current patchwork of legal rules in England and Wales governing the limits and safeguards underpinning police use of facial recognition, particularly the compilation and application of watchlists. This legislation could then meet the accessibility and foreseeability tests under the legality condition of Article 8 ECHR, providing the ‘minimum degree of legal protection’ to which citizens are entitled under the rule of law in a democratic society.Footnote 98 Such reform would also enable ‘greater certainty and accountability’ around police use of AI-based biometric surveillance systems and other emerging technologies.Footnote 99