High Court Finds Police Use of Automated Facial Recognition Technology Is Lawful
On 4 September 2019, the High Court handed down judgment in the judicial review case of R(Edward Bridges) v The Chief Constable of South Wales Police. The case concerned a challenge to the lawfulness of the use of Automated Facial Recognition technology (AFR) by South Wales police.
'AFR Locate' is a pilot project that South Wales police have been running since 2017 and uses overt camera surveillance of members of the public, which on two occasions likely included the claimant in this case. The images captured are then processed using AFR software to compare them with police images of particular people who are on watchlists compiled by the police. The watchlists compiled for the instances in which AFR had likely captured the claimant included individuals who were suspected of having committed a criminal offence, individuals wanted on a warrant, and individuals who had previously been arrested at one of the events the year before. The claimant was not on any of these watchlists.
The judgment noted that South Wales police had used AFR Locate approximately 50 times between May 2017 and April 2019.
The AFR technology used by South Wales police takes precise measurements of the facial features of the members of the public who are captured on the camera footage where AFR Locate is deployed. These measurements are unique identifiers, similar to a fingerprint or an extract of DNA, and are known as biometric data. The court found that this data is an important source of personal information, and that the obtaining and processing of such information using AFR did engage the claimant's Article 8 right to respect for his private life, which is protected by the Convention.
However, the court went on to find that this interference with the claimant's Article 8 rights was justified in law by the police's common law powers to prevent and detect crime and that, as there was no physical intrusion on individual rights, no express statutory powers were required for the police to use AFR. The court identified a satisfactory legal framework in the Data Protection Act 2018, the relevant codes of practice, and the specific policies of South Wales police, which together provided the legal certainty and foreseeability required to justify the interference with the Article 8 rights of members of the public.
The court further found that the use of AFR did involve the processing of sensitive information for the purposes of the Data Protection Act 2018, but that the requirements of s.35(5) of that legislation were met, so the police use of AFR did not breach the claimant's rights. The court did, however, note that South Wales police's own policy may need redrafting, as the current version was not sufficiently clear about how compliance with the DPA would be secured.
The court therefore found that, although this controversial new technology engages the private life rights of members of the public and involves the sensitive processing of their personal information, its use is proportionate and lawful in the circumstances of the pilot project being undertaken by South Wales police.
However, on 4 October 2019, only a month after the judgment, the Guardian reported that the Metropolitan police had apologised after passing images of seven people to a private company for use with its AFR technology at King's Cross. It is understood that the images were provided by the police between 2016 and 2018.
It is therefore apparent that the capturing and processing of facial biometric data by the police and the limits of the police powers to do so will be an area of continuing scrutiny and concern for the courts.