Automated facial recognition technology – police checkpoints by the back door?
There are currently two forms of facial recognition technology that police forces in the UK are known to use. One is 'facial matching', which involves matching a still image of someone against an image database. The second is 'automated facial recognition' ('AFR') technology, which involves the real-time scanning and analysis of the faces of people in public places.
The deployment of AFR is of particular concern, given its capacity to scan large swathes of the public as they go about their daily lives, capturing and storing their biometric data without their knowledge or consent, regardless of whether or not they are suspected of involvement in a criminal offence. Police forces have used the technology at festivals, football matches and peaceful protests, as well as in crowds of Christmas shoppers.
AFR has been used by police forces since at least 2015, despite there being no legal basis for it and no parliamentary scrutiny of it. The Home Office was due to publish a "Biometric Strategy" as long ago as 2013, which was then expected to encompass the use of such technology, yet none has been produced to date. The government has now stated that the strategy will be published this month. The Home Office's approach so far has been to leave it to individual police forces to decide if and how they deploy AFR.
Two legal challenges are now being brought against South Wales Police and the Metropolitan Police Service respectively in relation to their use of AFR. Both forces will face legal action if they do not halt their use of AFR immediately. The Metropolitan Police Service previously confirmed that it would not be using AFR at the Notting Hill Carnival this year, as it has done in previous years, but that it planned to continue to trial the technology elsewhere.
Despite the lack of guidance or policy, or provision for any independent oversight of its deployment, the Home Office has reportedly awarded South Wales Police a total of £2.8 million from the 'police transformation fund' to lead the charge in the roll-out of AFR. A Home Office webpage describes the 'police transformation fund' as being "intended to transform policing by investing in digitalisation, a diverse and flexible workforce and new capabilities to respond to changing crimes and threats".
However, concerns surrounding police use of AFR are threefold:
- It breaches data protection laws, as it gathers personal data despite there being no regulation governing its use. Given its propensity to indiscriminately capture the personal data of unsuspecting members of the public, such processing cannot be said to be necessary and proportionate for the purposes of law enforcement.
- It breaches individuals' human rights. Under the Human Rights Act 1998, any interference with an individual's right to a private life, as enshrined in Article 8 of the European Convention on Human Rights, must be necessary and proportionate. AFR's capacity to interfere with privacy rights is self-evident, and police forces' blanket use of AFR in public places and at peaceful gatherings appears to be neither proportionate nor necessary. Moreover, the knowledge that public authorities have the ability to record your attendance in a specific place or at a specific event without your knowledge will inevitably influence people's behaviour and, most crucially, their freedom of expression.
- Current AFR technology is highly inaccurate. "False positives" have been repeatedly identified. Freedom of Information Act requests made by 'Big Brother Watch' established that the Metropolitan Police Service's technology inaccurately matched innocent people 98% of the time. Furthermore, the technology is particularly inaccurate at identifying women and people who are black or minority ethnic. With regard to the latter, this raises the clear risk that the need for police to tackle the disproportionate misidentifications within these communities will be used by authorities as a basis for even greater over-policing of them.
Contact us for Specialist Legal Advice
Whether a social media website, search engine, local authority, police force or any other body holds data on you, you have the right to know what that data is and to understand why it is being held. Where mistakes are made, such as inaccurate information being held or your data being unlawfully disclosed to third parties, you have a right to seek redress. If you are concerned about how a private company or public authority is handling your data, contact us today; our solicitors may be able to assist you.