LONDON: The United Kingdom has become the only European nation to deploy real-time facial recognition technology on a massive scale across public spaces and retail environments.
Outside supermarkets and within festival crowds, millions of people are having their facial features scanned by AI-powered systems without their knowledge or consent.
At London’s Notting Hill Carnival, facial recognition cameras are positioned near entrances and exits to monitor the expected two million attendees celebrating Afro-Caribbean culture.
Metropolitan Police chief Mark Rowley defended it as “an effective policing tool which has already been successfully used to locate offenders at crime hotspots resulting in well over 1,000 arrests since the start of 2024.”
The technology first underwent testing in 2016 and has seen substantial expansion throughout the United Kingdom over the past three years.
According to the NGO Liberty, approximately 4.7 million faces were scanned during 2024 alone through these surveillance systems.
UK police have deployed live facial recognition around one hundred times since late January, compared to just ten deployments between 2016 and 2019.
The system scans faces in crowds and compares them against thousands of suspects already held in police databases.
When a person on a police watchlist passes near the cameras, typically mounted on vans, the AI-powered system triggers an alert.
Officers then confirm the person’s identity through additional checks before detaining them.
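The matching step described above can be sketched, in greatly simplified form, as comparing a numerical “embedding” of a scanned face against stored watchlist embeddings and raising an alert when similarity crosses a threshold. The names, vectors, and threshold below are illustrative assumptions, not details of any deployed police or retail system.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def check_watchlist(probe, watchlist, threshold=0.9):
    """Return the IDs of watchlist entries whose embedding matches the probe face.

    An empty result means no alert; any hit would prompt a manual identity check.
    """
    return [person_id for person_id, embedding in watchlist.items()
            if cosine_similarity(probe, embedding) >= threshold]

# Hypothetical 3-dimensional embeddings (real systems use hundreds of dimensions).
watchlist = {
    "suspect_A": [0.9, 0.1, 0.4],
    "suspect_B": [0.1, 0.8, 0.2],
}
probe = [0.88, 0.12, 0.41]  # embedding of a face seen by the camera

print(check_watchlist(probe, watchlist))  # → ['suspect_A']
```

The threshold is the key operational choice: set too low, it produces false alerts of the kind described later in this article; set too high, it misses genuine matches.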
The campaign group Big Brother Watch criticised this approach, saying such mass data capture “treats us like a nation of suspects”.
Rebecca Vincent, interim director of Big Brother Watch, told AFP that “there is no legislative basis, so we have no safeguards to protect our rights, and the police is left to write its own rules.”
Private use by supermarkets and clothing stores to combat rising shoplifting has also generated significant concern among privacy advocates.
Vincent added that “very little information” exists regarding how collected data is being used by these private entities.
Most retailers use Facewatch, a service provider that compiles lists of suspected offenders and alerts stores when those individuals enter monitored premises.
Daragh Murray, a human rights law lecturer at Queen Mary University of London, warned that “it transforms what it is to live in a city, because it removes the possibility of living anonymously.”
Murray further noted that “that can have really big implications for protests but also participation in political and cultural life.”
Many shoppers remain completely unaware they are being profiled when entering stores using these surveillance systems.
Abigail Bevon, a twenty-six-year-old forensic scientist, said she was “very surprised” to learn how the technology was being deployed.
While acknowledging its potential benefits for policing, Bevon said its use in shops felt “invasive”.
The European Union has prohibited real-time facial recognition in public spaces, with narrow exceptions, since February under its new artificial intelligence legislation.
Vincent emphasised that “we do not see anything even close in European countries or other democracies” apart from limited United States usage.
She added that “the use of such invasive tech is more akin to what we see in authoritarian states such as China.”
Interior minister Yvette Cooper recently promised a “legal framework” governing technology use focused on “the most serious crimes”.
Her ministry nevertheless authorised police forces to deploy the technology in seven new regions earlier this month.
Permanent cameras are scheduled for installation in Croydon, south London, marking the first fixed deployment beyond mobile van setups.
Police insist they maintain “robust safeguards”, including switching off cameras when officers are not present and deleting the biometric data of people who are not suspects.
The UK’s human rights regulator has nonetheless declared the Metropolitan Police’s policy on the technology “unlawful”, finding it “incompatible” with human rights law.
Eleven organisations including Human Rights Watch urged police not to use facial recognition during Notting Hill Carnival.
These groups accused police of “unfairly targeting” the Afro-Caribbean community and pointed to documented racial biases in AI facial recognition systems.
Shaun Thompson, a thirty-nine-year-old black London resident, was wrongly flagged as a criminal by facial recognition and detained by police.
Thompson has brought a legal challenge against the police over his mistaken identification and detention. – AFP