News

Police use of Automatic Facial Recognition software found unlawful in the Court of Appeal

In the case of Bridges, the Court of Appeal recently handed down the second judgment in what is thought to be the world’s first case on the lawfulness of Automatic Facial Recognition (AFR) when used by law enforcement agencies. It ruled that South Wales Police’s (SWP) already controversial use of the software breached privacy rights and broke equalities law.

Acting as a test for other forces in England and Wales, SWP rolled out AFR from June 2017 in public spaces in the Cardiff area. Over 500,000 faces were captured, saved and compared with those of people on a ‘watchlist’, ostensibly to apprehend suspects of crimes whom the police could identify by facial image. However, the watchlist was not limited to this; it also included persons of “possible interest” to the police. The vast majority of faces captured belonged to people who were not suspected of any wrongdoing at all. If a person’s biometric data did not match an image on the watchlist, it would be automatically deleted from the system. The images were captured without the consent of the subjects. The claimant, Ed Bridges, argued that the use of AFR was indiscriminate and disproportionate.
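For readers unfamiliar with how such systems generally work, the following is a minimal sketch of the capture-compare-delete flow described above. It assumes a generic embedding-and-threshold approach; the function names, similarity metric and threshold are hypothetical illustrations, not details of SWP’s actual system.

```python
# Purely illustrative sketch of the capture / compare / delete flow described above.
# The embedding model, similarity metric, threshold and names are assumptions for
# illustration only, not details of SWP's deployment.

from dataclasses import dataclass

@dataclass
class Face:
    embedding: list[float]  # numeric representation of a captured face

def similarity(a: Face, b: Face) -> float:
    """Cosine similarity between two face embeddings (assumed matching metric)."""
    dot = sum(x * y for x, y in zip(a.embedding, b.embedding))
    norm_a = sum(x * x for x in a.embedding) ** 0.5
    norm_b = sum(x * x for x in b.embedding) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def process_capture(captured: Face, watchlist: list[Face], threshold: float = 0.8):
    """Compare a captured face against the watchlist.

    Returns the best watchlist entry if it exceeds the threshold; otherwise the
    capture is discarded immediately, mirroring the automatic deletion of
    non-matching biometric data described in the judgment.
    """
    best = max(watchlist, key=lambda w: similarity(captured, w), default=None)
    if best is not None and similarity(captured, best) >= threshold:
        return best   # potential match: flagged for human review
    del captured      # no match: biometric data is not retained
    return None
```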

After an initial unsuccessful challenge in the High Court, Bridges successfully appealed to the Court of Appeal. The Court held that SWP’s use of AFR was unlawful on three grounds: the insufficiency of the legal framework governing its use; the deficiencies in the force’s data protection impact assessment; and the failure to meet the public sector equality duty.

The Court ruled that the overall legal framework governing the deployment of AFR was inadequate, and that the available policies and documents used by SWP did not answer key questions. The Court stated that individual police officers were left with too much discretion as to who could be placed on the ‘watchlist’ and where AFR could be deployed. It noted that the police had used wording in their privacy assessment that essentially allowed them to target anybody at all who was of interest to them, including “vulnerable persons and other persons where intelligence is required” – barely any criteria were applied to justify adding somebody to a watchlist.

It was further noted that the automatic and instantaneous deletion of the data of anyone who does not match a person on the watchlist should be a legal requirement.

Organisations in the public sector – i.e. all political, social and economic activity controlled by the state – have a duty to the public to uphold and defend principles of equality. This includes the requirement to eliminate unlawful discrimination and remove or minimise any disadvantages suffered by people due to their protected characteristics. For example, the police should take steps to ensure that people from ethnic minority backgrounds are not disproportionately affected or targeted by police activity, like stop and search.

Sadly, the Court found that the police did not do enough to fulfil this public sector equality duty in its deployment of AFR. It found that the potential for racial bias in AFR software raised a ‘serious issue of public concern’, with the risk that people from minority ethnic backgrounds would be disproportionately misidentified as matches for people on watchlists. It noted that the police had done nothing to investigate this potential.

Indeed, concerns have been raised that darker-skinned people are more likely to be falsely identified by AFR software as the suspect of a crime, or at least as a match for somebody on a watchlist. Although few studies have been conducted in the UK on whether AFR is biased along the lines of race and gender, a recent Guardian article gives further insight into the matter: “In 2018, a researcher at MIT’s Media Lab in the US concluded that software supplied by three [AFR] companies made mistakes in 21% to 35% of cases for darker-skinned women. By contrast, the error rate for light-skinned men was less than 1%.”

Ultimately, the Court was satisfied that AFR can be used by police forces, but much more needs to be done for its broader use to fall within the bounds of the law. Clear criteria need to justify somebody’s place on a watchlist, and reasons must be given as to why AFR is deployed in a particular place. Until then, this software remains another invasive tool of a state that has time and again been criticised for its draconian surveillance methods.

Sources:

https://www.theguardian.com/technology/2020/aug/11/south-wales-police-lose-landmark-facial-recognition-case

https://insights.doughtystreet.co.uk/post/102gdd2/polices-automated-facial-recognition-deployments-ruled-unlawful-by-the-court-of
