UK: Racist violence does not justify proposed expansion of police surveillance technology


Following the racist pogroms that broke out across England at the end of July and beginning of August, the prime minister, Keir Starmer, announced a range of new policing measures - including a proposal for "wider deployment of facial recognition technology." A letter signed by more than two dozen organisations, including Statewatch, says that an expansion of live facial recognition "would make our country an outlier in the democratic world" and calls for the plan to be dropped.



Image: Jonathan McIntosh, CC BY-SA 2.0


09 August 2024

The Prime Minister
Rt Hon Sir Keir Starmer MP
10 Downing Street
Westminster
London SW1A 2AA

cc: Rt Hon Yvette Cooper MP
Secretary of State for the Home Department

By email only.

Dear Prime Minister,

We are writing to you as a coalition of human rights, racial justice, migrants’ rights and civil liberties groups following your statement on the recent scenes of public disorder across the country and plans to introduce a “wider deployment of facial recognition technology.” [1] We are alarmed by the horrendous scenes of violence and chilling Islamophobic and racist attacks that have been taking place, which jeopardise the safety not only of those directly attacked but of marginalised communities across the country. Whilst we urge you to take robust action to stop the violence, protect our communities and bring those responsible for this criminal behaviour to justice, we have serious concerns regarding the use of facial recognition surveillance and urge you to drop any plans to expand police use of live facial recognition surveillance in particular.

When used in the context of policing, facial recognition technology (FRT) is a highly controversial form of biometric surveillance that currently faces restrictions and blanket bans in cities, states and countries in the US and Europe, [2] due to the serious threat it poses to privacy, freedom of expression and freedom of assembly. Live facial recognition (LFR) cameras subject thousands of passers-by to unwarranted biometric identity checks and invert the democratic principle of the presumption of innocence, by scanning and comparing the faces of members of the public en masse against a police watchlist. Despite the significant police resources expended by each LFR deployment, LFR continues to have issues with accuracy, [3] bias and discrimination. [4] On the day of your announcement to expand the police’s use of facial recognition surveillance, the EU’s AI Act came into force which broadly prohibits live facial recognition. [5] Should UK police forces expand the use of live facial recognition under your leadership, it would make our country an outlier in the democratic world.

It is notable that there is no explicit legal basis for FRT use by the police and it has never been debated by Parliament. The law governing authorities’ uses of facial biometrics is wholly inadequate, as identified by Matthew Ryder KC’s review (the Ryder Review), [6] and individuals’ rights to privacy, free expression and freedom of assembly are threatened by the use of LFR in particular. As you outlined in your address on 1st August, in times of crisis, upholding the rule of law is paramount – however, live facial recognition operates in a legal and democratic vacuum, and it is our view that its use for public surveillance is not compatible with the European Convention on Human Rights. As a public body, the police are under a duty to uphold human rights as outlined by Section 6 of the Human Rights Act. The legality of the Metropolitan Police Service’s use of LFR is currently subject to a legal challenge by an anti-knife crime community worker in London, Shaun Thompson, after he was misidentified by the technology and subjected to wrongful police questioning and threats in the London Bridge area earlier this year. [7] In Mr Thompson’s words, "instead of working to get knives off the streets like I do, police were wasting their time with technology (that) had made a mistake". In the wake of the shocking stabbings in Southport, we believe these words bear very strong significance. In 2020, the Court of Appeal ruled that the South Wales Police had unlawfully deployed LFR surveillance. [8]

We join you in condemning the racist, violent and disorderly scenes across the country. However, to rush in the use of technology which has a seriously negative bearing on our rights and freedoms would not only fail to address the causes of this dangerous violence, but set a chilling precedent, threaten the democratic rights of the very communities you are seeking to protect, and undermine Labour’s commitment to protecting human rights [9] and the UK's legal obligation to protect and uphold human rights under international law.

Live facial recognition surveillance would not make us, or the communities we represent, feel any safer. On the contrary, it puts our rights and the democratic health of the country more at risk. We urge you to rethink your plans to expand the use of facial recognition surveillance in the UK and would ask that you meet with us, as you have with police chiefs, to discuss the rights and equalities impacts of this AI mass surveillance. We look forward to hearing from you.

Yours sincerely,

Silkie Carlo, Director, Big Brother Watch
Antonia Lee, Stop the Scan Project Co-ordinator, Racial Justice Network
Chantal Joris, Senior Legal Officer, Article 19
Christina Tanti, Head of Research, Race Equality First
Deborah Coles, Executive Director, INQUEST
Fizza Qureshi, Chief Executive Officer, Migrants’ Rights Network
Gus Hosein, Executive Director, Privacy International
Habib Kadiri, Executive Director, StopWatch
Ilyas Nagdee, Racial Justice Director, Amnesty International
Jen Persson, Director, Defend Digital Me
Sara Chitseko, Pre-Crime Programme Manager, Open Rights Group
Kevin Blowe, Campaigns Coordinator, Network for Police Monitoring (Netpol)
Liz Fekete, Director, Institute of Race Relations
Minnie Rahman, Chief Executive, Praxis, for Migrants and Refugees
Nik Williams, Policy and Campaigns Officer, Index on Censorship
Romain Lanneau, Researcher, Statewatch
Shameem Ahmad, Chief Executive Officer, Public Law Project
Stephanie Needleman, Legal Director, JUSTICE
Tracey Bignall, Director of Policy and Engagement, The Race Equality Foundation
Yasmin Halima, Executive Director, The Joint Council for the Welfare of Immigrants
Access Now
European Network Against Racism
Faz
Amnesty
Northern Police Monitoring Project
No Tech for Tyrants
Revolving Doors
Street Fathers

Notes

[1] https://www.gov.uk/government/speeches/prime-minister-keir-starmers-statement-in-downing-street-1-august
[2] See, for example, the outright bans in Belgium and Luxembourg, and US state legislation such as the Illinois Biometric Information Privacy Act and the Texas biometric privacy law. See also: "Montana law restricting facial recognition use by police, public agencies takes effect", Chris Burt, Biometrics Update, 5th July 2023; "San Francisco is first US city to ban facial recognition", BBC News, 15th May 2019: https://www.bbc.co.uk/news/technology-48276660
[3] To date, South Wales Police’s and the Metropolitan Police’s use of LFR has produced 75% incorrect matches since the technology was first introduced.
[4] The Biometrics and Forensics Ethics Group warned that UK police’s use of LFR technology has the “potential for biased outputs and biased decision-making on the part of system operators.” See Biometrics and Forensics Ethics Group, Interim Report, February 2019.
[5] "Artificial Intelligence Act: deal on comprehensive rules for trustworthy AI", European Parliament, 9th December 2023: https://www.europarl.europa.eu/news/en/press-room/20231206IPR15699/artificial-intelligence-act-deal-on-comprehensive-rules-for-trustworthy-ai
[6] https://www.adalovelaceinstitute.org/project/ryder-review-biometrics/
[7] BBC, "I was misidentified as a shoplifter by facial recognition tech": https://www.bbc.co.uk/news/technology-69055945
[8] Liberty, Legal Challenge: Ed Bridges v South Wales Police: https://www.libertyhumanrights.org.uk/issue/legal-challenge-ed-bridges-v-south-wales-police/
[9] https://labour.org.uk/change/britain-reconnected/


Further reading

26 July 2024

UK: Call for "serious, meaningful protection" from police facial recognition technology

The UK's new Labour government must ensure "proper regulation of biometric surveillance in the UK," says a letter signed by nine human rights, racial justice and civil liberties groups, including Statewatch. "No laws in the UK mention facial recognition, and the use of this technology has never even been debated by MPs," the letter highlights. It calls on the new home secretary, Yvette Cooper, and the science, technology and innovation minister, Peter Kyle, to meet the signatory groups "to discuss the need to take action and learn from our European partners in regulating the use of biometric surveillance in the UK more broadly." A separate letter to Scotland's cabinet secretary for justice and home affairs raises similar points, and calls on the Scottish government "to stop the proposed use of live facial recognition surveillance by Police Scotland."

28 February 2022

Building the biometric state: Police powers and discrimination

This report examines the development and deployment of biometric identification technologies by police and border forces in Europe, and warns that the increasing use of the technology is likely to exacerbate existing problems with racist policing and ethnic profiling.

