Faceoff! UK appellate court finds police use of facial recognition technology contravenes laws

Outcome of case is likely to be far-reaching, including in Canada, writes Lisa Lifshitz

Lisa R. Lifshitz

In what has been described as one of the world's first successful legal challenges to police use of facial recognition technology, on Aug. 11 the UK Court of Appeal (Civil Division) found in R. (Bridges) v. CC South Wales & ors [2020] EWCA Civ 1058 that aspects of such use violated several laws, including the right to “private and family life” under Article 8 of the European Convention on Human Rights (the “Convention”), the Data Protection Act 2018 (“DPA 2018”) and the Public Sector Equality Duty (“PSED”) in section 149 of the Equality Act 2010.

This landmark decision concerned the lawfulness of the use of automated facial recognition technology (“AFR”) by the South Wales Police Force (“SWP”) in an ongoing trial project of a system called AFR Locate. The system deploys CCTV surveillance cameras to capture digital images of faces, which are processed to extract facial biometric information that is then compared with the biometrics of persons on a watch list compiled by SWP. When two facial images are compared, the AFR software generates a “similarity score,” a numerical value indicating the likelihood that the faces match.

If no match is detected, AFR Locate will automatically delete the facial image captured from the live feed. If the software identifies a possible match, the two images are reviewed by an AFR operator (a police officer), and if s/he does not believe a subject of interest has been identified, no further action is taken. However, if the AFR operator determines there is a match, other officers stationed nearby may intervene, for example, by asking to speak to the person concerned or even stopping and searching or arresting the individual.
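For readers who want the mechanics in concrete terms, the short sketch below restates the match/no-match workflow the Court described: automatic deletion where no match is found, human review of possible matches, and intervention only after operator confirmation. It is purely illustrative; the function names, data fields and the 0.75 threshold are hypothetical assumptions for the example, not details of SWP’s actual AFR Locate system.

```python
# Illustrative sketch of the decision flow described above.
# All names and the threshold value are hypothetical, not SWP's system.
from dataclasses import dataclass

@dataclass
class Candidate:
    image_id: str
    similarity_score: float  # numerical likelihood that the two faces match

MATCH_THRESHOLD = 0.75  # hypothetical cut-off for flagging a possible match

def process_capture(candidate: Candidate, operator_confirms) -> str:
    """Apply the workflow: auto-delete on no match; otherwise a human
    AFR operator reviews the possible match before anyone intervenes."""
    if candidate.similarity_score < MATCH_THRESHOLD:
        return "image deleted automatically"   # no match: biometric data discarded
    if not operator_confirms(candidate):
        return "no further action"             # operator rejects the possible match
    return "officers nearby may intervene"     # confirmed match: e.g. stop and speak

# Example: a borderline capture that the operator reviews and rejects
print(process_capture(Candidate("cam1-0042", 0.81), lambda c: False))
```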

AFR Locate is capable of scanning 50 faces per second, and there was no limit on the number of persons whose facial biometrics could be captured during any given deployment. On the facts, the Court found that over the 50 deployments undertaken in 2017 and 2018, around 500,000 faces may have been scanned, with the overwhelming majority of persons whose biometrics were captured using AFR Locate not suspected of any wrongdoing and not of interest to the police.

The SWP was obliged to and did make the public aware of its use of AFR technology, whether at events or in the area in question, and there was material about AFR on SWP’s website. Notwithstanding this, the Court found it reasonable to suppose that a large number of people were unaware that their facial biometrics were being captured and processed by SWP.

The appellant, Edward Bridges, lived in Cardiff. While challenging the lawfulness of SWP’s use of AFR Locate generally, he specifically objected to two particular deployments that caught him on camera. Bridges contended that on the first occasion he did not see signage and was given no other warning that AFR was in use before he came into close proximity to AFR-equipped vans that recorded his image; on the second occasion, he was not aware that AFR was in use and did not see any information about its use. SWP did not contest this.

Bridges brought a claim for judicial review on the basis that AFR (i) was not compatible with the right to respect for private life under Article 8 of the Convention and also breached Articles 10 and 11; (ii) breached the DPA 2018; and (iii) breached the PSED. In September 2019 the UK High Court of Justice dismissed Bridges’ claim for judicial review on all grounds, finding that any interference with his privacy rights was in accordance with law and proportionate.

Bridges appealed the decision on five grounds. While the Court reaffirmed that the use of AFR was a proportionate interference with human rights, it reversed the High Court’s decision in a number of areas, finding in Bridges’ favour on Grounds 1, 3 and 5.

First, the Court found that the High Court erred in concluding that SWP’s interference with Bridges’ Article 8(1) rights was “in accordance with the law” for the purposes of Article 8(2). The Court held that although there was a legal framework comprising primary legislation (DPA 2018), secondary legislation (the Surveillance Camera Code of Practice) and local policies promulgated by SWP, there was no clear guidance as to who could be placed on a watch list, nor evident criteria for determining where AFR could be deployed. Accordingly, the framework left too much discretion to individual police officers to meet the standard required by Article 8(2).

Bridges also succeeded on Ground 3, relating to flaws in the data protection impact assessment (DPIA) required by section 64 of the DPA 2018. The DPIA conducted by the SWP failed properly to assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to mitigate the risks arising from the deficiencies identified under Ground 1. The Court found the DPIA deficient because it was written on the basis that Article 8 was not infringed, despite the wide discretion given to police over the selection of those on the watch lists and the locations where AFR may be deployed.

Finally, the appeal succeeded on Ground 5: the High Court was wrong to hold that SWP complied with the PSED. Bridges alleged that the SWP erred by not investigating, before deploying AFR, whether its use could lead to discrimination on the basis of race and sex, since there is scientific evidence that facial recognition software can be biased and create a greater risk of false identifications of those from Black, Asian and other minority racial backgrounds, and of women.

While it was not alleged that the software used by SWP had this effect, the complaint was based on an alleged breach of the positive duty of public authorities, including police, to have due regard to the need to eliminate such discrimination. The Court found that, as AFR is a novel and controversial technology, the SWP did not do all it reasonably could to fulfill the PSED and ensure that the software used did not have a bias on grounds of race or sex.

While the ruling did not ban the use of AFR in the UK, the Court granted declaratory relief to reflect its findings and confirmed that UK law enforcement agencies have specific obligations to meet in order to use AFR. The SWP has confirmed it will not appeal the judgment, which means that other UK police forces using facial recognition technology (including the London Metropolitan Police, which recently deployed a similar system) must meet the standards set out in the Court’s ruling.

The outcome of this case on police use of the technology is likely to be far-reaching, including in Canada, where gaps in our legal framework concerning biometrics have allowed law enforcement and intelligence agencies to use facial recognition technology freely.

In January the Royal Canadian Mounted Police denied using Clearview AI’s facial recognition technology (powered by billions of photos scraped from the internet) despite having used it for months; it admitted to doing so the following month. The Ontario Provincial Police and the Toronto, Hamilton, Edmonton and Calgary police forces have also admitted to using the software.

Extensive media coverage regarding the use of such facial recognition technology by police and the resulting public outcry no doubt spurred a joint investigation of Clearview AI by the Office of the Privacy Commissioner of Canada (OPC) and its provincial counterparts in Quebec, British Columbia and Alberta to determine whether the company’s practice of collecting and using images without consent complies with federal and provincial privacy legislation. In July the company advised Canadian privacy regulators that it would cease offering its facial recognition services in Canada, including indefinitely suspending its contract with the RCMP, and two days later it announced it was quitting Canada altogether. The OPC’s regulatory investigation nonetheless continues, as does its separate investigation into the RCMP’s use of the technology.

In the absence of Canadian jurisprudence and specific regulations, the OPC and provincial privacy regulators have agreed to work together to develop meaningful guidance for organizations — including law enforcement — on the use of biometric technology. Cases such as R. (Bridges) v. CC South Wales & ors underscore the critical need to balance privacy, human rights and public transparency, and to prevent bias and discrimination, when law enforcement agencies consider deploying this powerful technology.
