Police Technology and its dangers – The Gangs Matrix as a case study of potential data-driven harms
Youth violence and “gangs” have been a hot topic in media and policy discourse for over a decade. The overarching narrative is that serious violence among young people in this country is on the rise and that the number of “gangs” perpetrating violence has grown exponentially on a nationwide scale. On the other hand, several experts, often with lived experience, are concerned that the police do “not have a full understanding of who is in a gang and who isn’t in a gang”, and that this lack of understanding has rendered the “War on Gangs” a “racialized war on working class black youths”.
Alongside this increased focus on youth violence has been a rise in police use of data-driven technology to tackle serious crime. This includes predictive policing tools, Facial Recognition Technology, Automatic Number Plate Recognition, social media monitoring and mobile device data extraction.
Increased use of data-driven technology by public authorities can be seen as a natural consequence of an increasingly technological society. However, when such technology is used to disproportionately police minority communities, there is potential for serious discriminatory practice. The Metropolitan Police’s Gangs Matrix database is a compelling case study of police use of technology and its potential for data-driven harm.
One of the ways in which the government and police have sought to address this lack of understanding of the composition of “gangs” in the UK is the Gangs Matrix police database.
This targeted policing tactic was introduced in response to the 2011 England riots. It is important to examine the context in which the database was introduced and the driving forces behind its inception: they are indicative of how the police and government have approached youth violence over the last 10-15 years, and of how over-reliance on targeted, data-driven technology can further marginalise youths from minority backgrounds, in turn fuelling the vicious cycle of youth violence.
The Gangs Matrix – why was it established and what is its impact?
The England riots of 2011 were sparked by the police shooting of Mark Duggan. Operation Trident (later Trident Gang Crime Command) was the police unit that conducted the operation which led to the shooting. The unit was established in 1998 primarily to tackle gun crime and gang-related violence within black communities in London. It expanded significantly in 2000, as part of a fresh commitment by the Met to improve its performance in “tackling black community murders”, and in 2004 Trident expanded further to cover all gang-related shootings across the capital.
This targeted policing of specific minority communities, especially when enforced disproportionately through intrusive tactics such as stop and search, has exacerbated distrust between those communities and the police. Throughout history that dynamic has repeatedly erupted into rioting and clashes with the police: the 1965 Watts riots in Los Angeles, the 1981 Brixton riots, the 1992 Los Angeles riots (in response to the beating of Rodney King), the 2005 French riots and the 2014 Ferguson riots.
However, in 2011, instead of addressing an obviously problematic relationship between minority racialized communities and the police, the government responded to the riots with a further expression of targeted racialized policing: the Gangs Matrix database. This secretive police database was designed as a “risk management” watchlist of people whom the police designate as “gang nominals” based on vague criteria such as social media activity, known criminal activity, and referrals by third-party institutions such as housing associations, Pupil Referral Units (PRUs) and other children’s and community services.
Statistics from 2022 show nearly 87% of the individuals on the database are Black, Asian or other minority ethnic (‘BAME’), and 79% are Black. This is extremely disproportionate considering that research from the Centre for Crime and Justice Studies shows just 27% of those convicted of offences related to serious youth violence are Black.
Most people on the Matrix (65%) are considered by the police to be ‘low risk’. However, inclusion on the Matrix can expose a person to a wide range of ‘enforcement actions’, including exclusion from benefits, housing and education, as well as increased stop and search. Information on the Matrix is also shared with other bodies, such as immigration enforcement. There is a serious lack of transparency: the Metropolitan Police do not inform people that they are on the Matrix and provide no mechanism by which a person can appeal against their inclusion or ask for the data held about them to be reviewed.
In November 2018, the Information Commissioner’s Office (ICO) published an enforcement notice detailing how the Metropolitan Police’s use of the Gangs Matrix had breached data protection law continuously since its creation in 2011. The notice highlighted that the Met Police:
- Did not process personal data fairly or lawfully. Data was being shared in an unredacted form across a range of public and private bodies. Such sharing was disproportionate and unnecessary to prevent or detect crime.
- Processed personal data excessively in relation to its stated purpose. 64% of individuals were at low/zero risk of gang activity yet were retained on the Matrix.
- Processed inaccurate data. Victims of gang related crime were incorrectly presumed to have gang associations themselves. There was no consistency of approach in determining what constituted ‘gang membership.’
- Retained and processed personal data longer than necessary. Even after a person was removed from the Matrix, their personal data was retained on an informal list of ‘gang associates’.
- Failed to take appropriate measures against unlawful processing or accidental loss of personal private data. The data was unencrypted and often transferred in unsecured ways. Information governance was poor. The lack of control over the data in the matrix led to a significant data breach by Newham Council, for which it was fined £145,000 by the ICO.
It was also noted by the Information Commissioner’s Office that the data on the Matrix “cannot be said to be accurate”, and could include unsubstantiated claims about a person, and that this risked people being wrongly added to the Matrix and suffering “very significant unjustified impacts”.
Legal Challenge
There is growing recognition of the Matrix’s discriminatory disproportionality among both regulators such as the ICO and concerned groups such as The Monitoring Group, Amnesty International, Liberty, StopWatch and UNJUST, all of whom have published damning reports on the Met Police’s use of the Gangs Matrix. This is further reflected in the progress recently made in court by Liberty and UNJUST in challenging the Metropolitan Police over the controversial database.
Liberty acted on behalf of Awate Suleiman, a musician and writer who had been trying since 2019 to find out whether he is on the Matrix, and UNJUST, a not-for-profit organisation which specialises in challenging unjust policing policies and practices that disproportionately impact people of colour and other minority ethnic groups. Liberty challenged the Met Police with the intention of taking them to court, highlighting the police harassment caused by the database and other impacts, such as individuals on the secretive database facing exclusion from job applications.
This legal challenge became a landmark case in November 2022 when, a week before the case was due to be heard at the Royal Courts of Justice, the Met agreed that “wholesale change” was needed, admitting that the operation of the database was unlawful and that it breached the right to a private and family life through its disproportionate representation of black people, effectively settling out of court. The Met also agreed to remove the majority of individuals on the Matrix and to tell people who ask whether they are on the Matrix, who their data has been shared with, and what that data was.
This successful legal challenge constitutes a very significant milestone for those advocating for human rights in relation to policing practice and the disproportionate policing of minority communities. However, there is of course still much to be done, and other aspects of police use of data-driven technology remain a concern in terms of their racialising effects and impact on minority communities.
Other Examples of Police technology
Facial Recognition Technology
One of the areas of policing that has seen the most significant technological advancement is surveillance and identification. Facial Recognition Technology is one of the more innovative and controversial examples, and it has attracted significant media and public attention in recent years as the police have ramped up their use of it.
Automated facial recognition technology can detect faces in an image or video and cross-reference them against other databases in an attempt to identify individuals (significant, given the above analysis of the disproportionality of these databases). Police are deploying high-resolution cameras in public spaces in order to identify, in real time, individuals who may be on a watch list or suspect list. Research shows that automated facial recognition technology disproportionately misidentifies women and people from black and minority ethnic groups, making it much more likely that the police will wrongfully stop, question and possibly arrest them. In addition, early deployments of facial recognition in the UK have mainly targeted events that serve minority ethnic communities, such as the Notting Hill Carnival, for the identification of “persons of interest”.
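To make the mechanism concrete, the watchlist matching described above can be sketched in a few lines. This is a purely illustrative toy, not any real police system: the embeddings and names are hypothetical, and real deployments use neural networks to produce face embeddings. The point it demonstrates is that a single operator-chosen similarity threshold decides who triggers an alert, and that two different people can sit close enough together for a loose threshold to misidentify one as the other.

```python
import math

# Illustrative sketch only: real systems derive embeddings from a neural
# network; these short hypothetical vectors stand in for them.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold):
    """Return watchlist entries whose similarity to the probe face
    meets the threshold -- each one would trigger a police 'alert'."""
    return [name for name, emb in watchlist.items()
            if cosine_similarity(probe, emb) >= threshold]

# Hypothetical case: two *different* people whose embeddings happen to
# be close -- exactly the situation in which misidentification occurs.
watchlist = {"person_of_interest": [0.9, 0.1, 0.4]}
innocent_passerby = [0.85, 0.2, 0.35]

# A loose threshold flags the innocent passer-by as a match; a stricter
# one suppresses the false alert, at the cost of missing genuine matches.
print(match_against_watchlist(innocent_passerby, watchlist, threshold=0.8))
print(match_against_watchlist(innocent_passerby, watchlist, threshold=0.995))
```

The trade-off the threshold encodes is never neutral: where misidentification rates differ across demographic groups, the same threshold produces more false alerts, and therefore more wrongful stops, for some groups than others.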
It is important to note that it isn’t only concerned community and activist groups raising issues about the impact of Facial Recognition Technology. Public authorities in the US, namely San Francisco, the City of Oakland, and Somerville, Massachusetts, have all voted to ban the use of facial recognition technology by city agencies, including police departments, and in the UK the practice is currently subject to a legal challenge.
Automatic Number Plate Recognition
The use of Automatic Number Plate Recognition (ANPR) technology by the police in the UK has raised concerns about its potential for discriminatory practice. ANPR systems capture and analyse vehicle number plates, allowing the police to track vehicles, identify stolen cars, enforce traffic regulations, and monitor the movements of individuals of interest. The technology hasn’t changed much since it was introduced more than fifteen years ago; however, there are new proposals to link these systems with other existing video surveillance systems and combine their analytic capabilities, including those from automated facial recognition systems.
Presently, cars can be ‘marked’ as being linked to certain crimes, leading to increased stop and search through ANPR-triggered traffic stops. The databases used to determine which cars should be flagged will operate with similar (if not the exact same) algorithms and “intelligence” that drove the racial disproportionality of the Gangs Matrix. This means that the vague criteria used to add perceived “gang nominals” to the Matrix may also be applied to the flagging and stopping of vehicles driven by young men from minority ethnic backgrounds, resulting in their being stopped and questioned at a disproportionate rate. If not adequately monitored, ANPR can easily become another catalyst for the over-policing of minority communities, as it facilitates targeted and intrusive policing of these communities. Lastly, ANPR can facilitate searches for licence plates of a particular nationality, as well as for “motorhomes or caravans”, which experts have concluded is suggestive of a potentially discriminatory focus on Travellers or Roma.
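The flagging mechanism above amounts to a simple lookup, which is what makes it so easy for upstream bias to propagate. The sketch below is illustrative only: the plate numbers, flag reasons and watchlist are hypothetical, not drawn from any real police database. What it shows is that once a plate is marked, every camera the car passes repeats the same lookup, so whatever criteria produced the flag are re-applied on every journey the driver makes.

```python
# Illustrative sketch only: hypothetical plates and flag reasons.
FLAGGED_PLATES = {
    "AB12CDE": "linked to a perceived 'gang nominal' on an intelligence database",
    "XY34ZZZ": "reported stolen",
}

def anpr_check(plate_read: str):
    """Simulate the lookup an ANPR camera performs on each passing plate.
    Returns the reason for an alert, or None if the plate is unflagged."""
    normalised = plate_read.upper().replace(" ", "")
    return FLAGGED_PLATES.get(normalised)

# One biased database entry is enough to trigger repeated roadside stops:
print(anpr_check("ab12 cde"))  # alert, with the recorded reason
print(anpr_check("ZZ99 AAA"))  # no alert
```

The code itself is trivially ‘fair’: it treats every plate identically. The disproportionality lives entirely in who gets entered into `FLAGGED_PLATES` and why, which is exactly why scrutiny of the upstream intelligence matters more than scrutiny of the camera.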
Mobile Fingerprint Technology
Lastly, mobile fingerprint technology is the latest innovation enhancing a long-used police capability: identifying individuals by their fingerprints. Using an app on an officer’s phone and a handheld scanner, police are able to check fingerprints against existing police and government databases (including immigration records) in real time.
It is important to note that the accuracy and reliability of fingerprint identification depend on the quality and composition of the reference databases used. If the databases consist primarily of fingerprints from certain demographic groups, biased outcomes can result. For example, if databases are disproportionately composed of fingerprints from individuals who have previously been arrested or convicted, this can perpetuate systemic biases and disproportionately target specific communities, leading to potential discrimination.
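This database-composition effect can be demonstrated with a toy simulation. Everything below is hypothetical: the groups, the numbers and the print identifiers are invented to isolate one mechanism. Both groups are checked at the same rate, yet because group A is over-represented in the reference database (standing in for a history of over-policing), checks on group A come back as ‘hits’ four times as often.

```python
# Illustrative simulation only: all groups, counts and IDs are hypothetical.

def hit_counts(street_checks, database):
    """Count, per demographic group, how many fingerprint checks
    return a hit against the reference database."""
    counts = {}
    for person in street_checks:
        if person["print_id"] in database:
            counts[person["group"]] = counts.get(person["group"], 0) + 1
    return counts

# Reference database skewed by past enforcement: 80 prints from group A,
# only 20 from group B.
database = {f"A{i}" for i in range(80)} | {f"B{i}" for i in range(20)}

# Both groups are stopped and checked equally often (100 checks each).
street_checks = (
    [{"group": "A", "print_id": f"A{i}"} for i in range(100)]
    + [{"group": "B", "print_id": f"B{i}"} for i in range(100)]
)

print(hit_counts(street_checks, database))  # group A: 80 hits, group B: 20
```

Each hit is then a fresh data point justifying further attention to group A, which is the feedback loop the following paragraph describes.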
Furthermore, reliance on fingerprint technology will contribute to the over-policing of specific communities, particularly those already subjected to higher levels of police scrutiny. This is likely to further contribute to the cycle of discrimination, as increased surveillance and fingerprint collection will not only perpetuate stereotypes and reinforce existing biases but also further marginalise youths from certain communities.
Conclusion and reflection
Other areas of police technology use that this article hasn’t touched on in depth include social media monitoring (an important aspect of the Gangs Matrix selection criteria), call detail records, IMSI catchers (phone cell-site technology), body-worn cameras and mobile phone data extraction. These aspects have been subject to less scrutiny in terms of their impact on minority demographics, but it remains vitally important to monitor the police’s use of such technology consistently, as the potential for discrimination remains highly pertinent.
Overall, it is important to note that technology isn’t neutral. Technology does not randomly come into existence and is hardly ever purposeless. New technologies are products of their time, influenced by both political and social factors. Companies design technology with specific purposes in mind, such as solving a certain ‘crime’ challenge, possibly for a specific police client, and therefore are likely to incorporate many of the client’s ideas, assumptions, and pre-established narratives. This is especially true in cases whereby the technology adds another layer of discretion to low-ranking officers in their day-to-day engagements with the public.
Rather than declaring the technology neutral or amoral, we need to assess it in its wider context and consider that it was likely developed by a company, possibly founded by a former intelligence or law enforcement professional, to be sold to intelligence and law enforcement bodies with a view to making a profit. With that in mind, rather than seeing the creation and adoption of a new kind of policing technology as a neutral act, we should scrutinise it in the same way as any other new policy or development in policing and subject it to proper independent scrutiny and equality impact assessments.
More attention needs to be paid to the labels that the police, government and justice system are assigning to our most disadvantaged youths, and to how this affects not only their interactions with society but also their self-view and self-esteem, and how this is reflected in their behaviour. The Gangs Matrix is based on a vague, racialised concept of “the gang”, and we mustn’t allow algorithm-led technology to determine who, or what, a gang is.
Furthermore, the police’s use of the Gangs Matrix, and the concerns this article has outlined regarding the database’s racial disproportionality, highlight the urgent need for serious reform of the government’s response to problems such as youth violence, a response currently over-reliant on targeted, intrusive policing that only causes further marginalisation.
These issues have well-researched, deeply rooted socio-economic determinants, and efforts to address them must focus on tackling inequality and marginalisation, as well as creating opportunities for our most disadvantaged youths. This approach is much more likely to produce positive outcomes and to contribute to a virtuous dynamic rather than a vicious cycle. It is also more likely to positively influence the self-belief of our most disadvantaged youths, which has been damaged by years of marginalisation and targeted policing. Data-driven intelligence on perceived “gang nominals” could have been used to identify the youths most in need of holistic support, reducing the marginalising effects of the unequal society we live in, rather than serving as a mechanism for further intrusive policing with extremely limited potential for positive outcomes.