Facial recognition software used by police wrong in more than 90% of cases

Police should drop facial recognition software amid concerns it is “almost entirely inaccurate”, campaigners have warned.

Figures obtained by Big Brother Watch through Freedom of Information requests show that 98% of the “matches” flagged by the Metropolitan Police’s technology were wrong; for South Wales Police the figure was 91%.

The software is used at major events like the Notting Hill Carnival, sporting fixtures and music concerts to detect people on a watch list, including wanted criminals.

Director of Big Brother Watch Silkie Carlo said: “We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals.

“It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to our freedoms.

“This has wasted millions in public money and the cost to our civil liberties is too high. It must be dropped.”

Figures released by the Metropolitan Police showed 102 false positives – cases where someone was incorrectly matched to a photo – and only two correct matches.

Neither correct match led to an arrest: one person was no longer wanted by police, and the other was classed as a “fixated individual” who attended a Remembrance Day event.

For South Wales Police, 2,451 of the 2,685 matches were incorrect – 91%. Of the remaining 234 matches, there were 110 interventions and 15 arrests.
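Both published percentages are consistent with treating the error rate as incorrect matches divided by all matches flagged. A minimal Python sketch, assuming that is how the figures were derived:

    # Quick check of the error rates reported above, assuming each figure
    # is simply incorrect matches divided by total matches flagged.
    met_false, met_true = 102, 2          # Metropolitan Police counts
    swp_false, swp_total = 2451, 2685     # South Wales Police counts

    met_rate = met_false / (met_false + met_true)   # 102 / 104
    swp_rate = swp_false / swp_total                # 2,451 / 2,685

    print(f"Met: {met_rate:.0%}")           # Met: 98%
    print(f"South Wales: {swp_rate:.0%}")   # South Wales: 91%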

The force used the software at various events including the 2017 Uefa Champions League final in Cardiff, international rugby matches and Liam Gallagher and Kasabian concerts.

Big Brother Watch said South Wales Police had stored pictures from both false positive and true positive matches for 12 months, meaning the force may have held images of more than 2,000 innocent people without their knowledge. The force said the images were stored only as part of an academic evaluation for UCL, not for any policing purpose.

The software used by South Wales Police and the Met has not been tested for demographic accuracy, but in the United States concerns have been raised that facial recognition is less reliable for women and black people.

The report said: “Disproportionate misidentifications risk increasing the over-policing of ethnic minorities on the premise of technological ‘objectivity’.

“This issue will be further compounded if police disproportionately deploy automated facial recognition in areas with high BME (black and minority ethnic) populations, such as the Metropolitan Police’s repeated targeting of Notting Hill Carnival.”

It also highlighted that the software is used to search the Police National Database, which contains hundreds of thousands of images of innocent people.

“With more and more biometric images being fed into the database – subsets of which are used with real-time facial recognition cameras – innocent people are increasingly at risk of being wrongfully stopped or even arrested,” it said.

Big Brother Watch’s call for UK authorities to stop using automated facial recognition software with surveillance cameras is backed by David Lammy, Labour MP for Tottenham, and campaign groups including the Football Supporters’ Federation, Index on Censorship, Liberty, and the Race Equality Foundation.

A spokeswoman for Scotland Yard said the force was trialling facial recognition technology and had used it at the previous two Notting Hill Carnivals and the 2017 Remembrance Sunday service “to assess if it could assist police in identifying known offenders in large events, in order to protect the wider public”.

Addressing false positives, the force said: “We do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts.”