Facial recognition: the United Kingdom intensifies surveillance of its citizens

Facial recognition: Millions of Britons have their faces scanned in real time

This large-scale deployment of facial recognition in the United Kingdom, unique in Europe, is raising concerns.


AFP

At the entrance to a supermarket, in the crowd at a festival: millions of Britons now have their faces scanned by real-time facial recognition technology, in the only European country to deploy it on a large scale. At the Notting Hill Carnival, where two million people were expected on Sunday and Monday, cameras using this technology were installed at the entrances and exits of the parade route.

The objective, according to the police: to “identify and intercept” people in real time, by scanning faces and comparing them against the thousands of suspects in its database. “Real-time facial recognition is an effective tool (…) which has enabled more than 1,000 arrests since early 2024,” said Mark Rowley, London’s police chief, who plans to “more than double its use”.

“A country of suspects”

The use of these technologies has already grown considerably over the past three years, from around ten operations between 2016 and 2019 to around a hundred since the beginning of 2025. In total, the faces of 4.7 million people were scanned in the United Kingdom in 2024. The cameras are mounted on the roof of a van in which police officers work; when a suspect passes nearby, the AI-based system triggers an alert, allowing officers to stop the person immediately.

Its “large-scale” use, at the coronation of Charles III in 2023 and this year ahead of the Oasis concerts and Six Nations matches, is turning the United Kingdom into “a country of suspects”, worries the organization Big Brother Watch. “There is no legislative basis (…) so the police have free rein to write their own rules,” said Rebecca Vincent, its interim director.

Private use of the technology by supermarkets and shops to combat shoplifting is also rising sharply, with “very little information” available about their data collection. Most use Facewatch, a provider that maintains a list of suspected offenders in the stores it monitors and raises an alert as soon as one of them enters a shop.

Banned in the EU

In the EU, the legislation that has governed artificial intelligence since February prohibits the use of real-time facial recognition technologies, with exceptions notably for counterterrorism. Apart from a few cases in the United States, “there is nothing comparable in European countries or in other democracies; the use of this technology (in the United Kingdom) is closer to that of authoritarian states like China,” said Rebecca Vincent.

“This changes the way people live in the city by removing the possibility of anonymity,” and it may discourage participation in demonstrations in particular, warns Daragh Murray, a lecturer at Queen Mary University of London.

Cameras in troubled neighborhoods

Home Secretary Yvette Cooper recently promised a “legal framework” to delimit its use, emphasizing the fight against “serious crimes”. Without waiting, the Home Office has just extended the use of the technology to seven new regions of the United Kingdom.

Permanent cameras are also due to be installed in Croydon, a southern district of the capital considered troubled. The police insist they have “robust safeguards”, promising to delete the biometric data of people who have nothing to answer for. But the British regulator responsible for human rights has found the London police’s use of this technology “illegal” because it is “incompatible” with respect for those rights.

Eleven organizations had urged the police to refrain from using it during the Notting Hill Carnival, accusing them of “unfairly targeting” that community and pointing to the racial biases of AI. They cite the case of Shaun Thompson, a Black man arrested after being wrongly identified by one of these cameras, who has taken legal action against the London police.
