Live facial recognition: The AI tech changing the way police fight crime
ITV News Correspondent Sejal Karia explores why the police's use of facial recognition technology is proving controversial
Words by ITV News Correspondent Sejal Karia and Investigations Producer Kieran Carter
It’s been described as the greatest innovation in policing since fingerprinting. Live facial recognition (LFR) technology is increasingly being used by police forces across the country in the fight against crime.
This year alone, London’s Metropolitan Police says it’s helped officers arrest 540 individuals wanted for criminal behaviour.
Of these arrests, more than 50 were of individuals allegedly involved in violence against women and girls, including strangulation, stalking, domestic abuse and rape.
In addition, the Met has arrested over 50 sex offenders who were found to be in breach of their court conditions.
Campaigners, however, are worried about what this new technology may mean for civil liberties. They argue it risks turning the country into a "mass surveillance state", can lead to misidentification, and lacks any specific laws governing its use.
This year, the Met Police has deployed LFR tech more than 160 times. We joined its officers in Croydon, south London, to see how it worked.
The cameras themselves are mounted on top of a police van, scanning the high street in all directions.
Signs are placed nearby to tell the public their faces are being scanned. If they don't match a photo on file, their image is automatically deleted.
Inside the van, officers monitor screens which alert them if the cameras spot a match.
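In outline, systems like this convert each captured face into a numerical signature and compare it against every entry on the watchlist; a single score threshold decides between raising an alert and deleting the image. Below is a minimal sketch of that loop, with invented details throughout – the face "embeddings", the cosine-similarity measure and the 0.6 threshold are illustrative assumptions, not the Met's actual software.

```python
# A minimal sketch of the scan-compare-delete flow described above – not the
# Met's actual system. The face "embeddings", the cosine-similarity measure
# and the 0.6 alert threshold are all assumptions made for illustration.
import numpy as np

def process_face(probe, watchlist, threshold=0.6):
    """probe: embedding of one scanned face; watchlist: {person_id: embedding}."""
    best_id, best_score = None, -1.0
    for person_id, ref in watchlist.items():
        # Cosine similarity between the scanned face and this watchlist entry.
        score = float(np.dot(probe, ref) /
                      (np.linalg.norm(probe) * np.linalg.norm(ref)))
        if score > best_score:
            best_id, best_score = person_id, score

    if best_score >= threshold:
        return "ALERT", best_id, best_score   # officers verify by eye before acting
    return "DELETED", None, best_score        # no match: the capture is discarded

# Demo with random vectors standing in for real face embeddings.
rng = np.random.default_rng(0)
watchlist = {f"person_{i}": rng.standard_normal(128) for i in range(15_000)}
print(process_face(rng.standard_normal(128), watchlist))
```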
We didn’t have to wait long.
Within minutes, someone triggered an alert. Their face had been caught by the cameras on top of the van and checked against over 15,000 people on the police "watchlist" that day.
All were individuals wanted for alleged criminal offences or subject to conditions imposed by a court.
This man was a registered sex offender. A few checks revealed he was adhering to the conditions imposed on him – and officers let him go.
To see how quickly the technology could work, we put my photo onto the day's watchlist. Within a second of my walking into the cameras' range, an alarm went off.
Nearby officers received, on a handheld device, a side-by-side comparison of my watchlist photo and the face the cameras had captured. It was then up to them to decide whether I was who the computer said I was.
By the end of the day there'd been nine alerts in total, all of them correctly identifying someone on the police's watchlist. Six were sex offenders who were found to be complying with the conditions of their release.
The other three led to arrests, one of which involved a man wanted for six different offences of theft.
The Met Police says its LFR cameras have scanned over 700,000 faces this year. More than 1,000 triggered alerts, but 27 of these were recorded as "false" – essentially the computer getting it wrong, a rate of under 3% of all alerts.
One of those 27 is Shaun Thompson, an anti-knife crime campaigner from London. He says he was stopped by police during an LFR deployment at London Bridge in February this year.
Police questioned him about his identity for over 20 minutes, asking for photo ID and fingerprints. He was eventually let go – an alleged case of mistaken identity he says left him shaken.
“It’s stop and search on steroids,” Shaun Thompson told ITV News.
“That's what I look at that as. It's worse than a stop and search. You know, at least you have to be suspected of doing something with a stop and search.”
The police say the locations of their deployments are intelligence-led and target crime hotspots. Shaun told us he felt these deployments disproportionately affect ethnic minority communities.
One of the ways the police guard against racial bias in the technology is by controlling how confident the system must be that a face matches a wanted individual before it triggers an alert – in effect, by raising or lowering a score threshold (a simple illustration follows below).
A report into LFR by the National Physical Laboratory found that at the threshold the police operate, there is no risk of racial bias – something that has been shown to be an issue at lower settings. Despite this, Shaun's not convinced.
“It's lazy policing," he said.
“You're relying on the machine to scan and do your job for you."
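The toy simulation below, using entirely invented score distributions rather than anything from the Met's system, illustrates the trade-off the police are tuning: raising the threshold produces fewer false matches like Shaun's, at the cost of missing more people who really are on the watchlist.

```python
# Toy illustration of the alert-threshold trade-off. The score distributions
# and thresholds are invented for demonstration; they are not the Met's settings.
import numpy as np

rng = np.random.default_rng(0)
genuine  = rng.normal(0.75, 0.08, 100_000)  # scores when the person IS on the watchlist
impostor = rng.normal(0.40, 0.10, 100_000)  # scores for unrelated passers-by

for threshold in (0.45, 0.55, 0.65):
    false_match_rate = (impostor >= threshold).mean()  # innocent people flagged
    miss_rate = (genuine < threshold).mean()           # wanted people not flagged
    print(f"threshold={threshold:.2f}  "
          f"false matches={false_match_rate:.2%}  misses={miss_rate:.2%}")
```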
Mr Thompson is now bringing a legal claim against the Met Police following his stop in February. The force has declined to comment on his specific case.
Human rights group Big Brother Watch is assisting him in this claim.
Its director, Silkie Carlo, was there the night Shaun was stopped.
“[It’s] potentially the end of privacy in this country as we know it. It means that each of us can be biometrically scanned and checked against vast databases by the police for absolutely no reason,” Ms Carlo told ITV News.
“I think all of us as members of the public want the police to catch criminals, but we also don't want to live in an open prison.”
Critics like Big Brother Watch argue technology like this treats innocent members of the public as guilty suspects, subjecting them to a constant digital police line-up.
“It completely turns the presumption of innocence in this country onto its head.
“Suddenly everyone is a suspect and you have to be cleared by a computer system. If it gets it wrong, you still have to prove who you are,” Ms Carlo said.
Lindsey Chiswick, the Met Police’s director of intelligence and national lead on facial recognition, disagrees.
“We have a technology here and now today that has already taken 500 wanted offenders off the streets of London. As a woman in London, I feel safer knowing that rapists are no longer wandering the streets,” Ms Chiswick said.
“I think it's really important to underline that an engagement conversation doesn't always end in an arrest. This is normal everyday policing.”
Despite this, Ms Chiswick recognised the need to engage with concerned communities more.
“I think we need to build trust. And that's what all of the engagement we do is.
“The Metropolitan Police has done really diligent work with an independent body so that we understand how we can operate this new technology safely,” she added.
Technology like this has been used by South Wales Police for years. It says that since 2019, it has had no misidentifications and has scanned over 2.6 million faces across 72 deployments.
The cameras are now being trialled by forces in Hampshire, Essex and Bedfordshire, and earlier this year Prime Minister Sir Keir Starmer announced plans to roll the technology out more widely.