Court rules South Wales Police's use of facial recognition technology breached privacy and data protection laws



The Court of Appeal has ruled that South Wales Police's use of facial recognition technology breached privacy rights and data protection laws.

Activist Ed Bridges took South Wales Police to the High Court in London after he said his face was scanned twice using the technology.

Two judges previously ruled that the use of facial recognition was lawful but in November last year, Mr Bridges was given the right to appeal.

The case is thought to be the world's first legal challenge over the use of automatic facial recognition technology.

The chief constable of South Wales Police said they will give the Court of Appeal's findings "serious attention".

Ed Bridges, 37, had his face scanned while Christmas shopping in Cardiff in 2017 and then again at a peaceful anti-arms protest outside the city's Motorpoint Arena in 2018.

He claimed this use of the technology caused him "distress" and violated his privacy and data protection rights by processing an image taken of him in public.

In September 2019 the High Court dismissed Mr Bridges' case, deeming the use of facial recognition lawful. With the help of civil rights group Liberty, the activist then took his case to the Court of Appeal.



In Tuesday's ruling, three Court of Appeal judges ruled the force's use of facial recognition was unlawful, allowing Mr Bridges' appeal on three of the five grounds he raised in his case.

The judges said that there was no clear guidance on where Automatic Facial Recognition (AFR) Locate - the system being trialled by South Wales Police - could be used and who could be put on a watchlist.

The court ruled that "too much discretion is currently left to individual police officers".

The court also found that a data protection impact assessment of the scheme was deficient and that the force had not done all they could to verify that the AFR software "does not have an unacceptable bias on grounds of race or sex".

The judgment notes that there was no clear evidence that the software was biased on grounds of race or sex.

In a statement, Mr Bridges said: "I'm delighted that the court has agreed that facial recognition clearly threatens our rights.

"For three years now South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance." 

Matt Jukes, chief constable of South Wales Police, said that he is "confident" today's ruling is something the force "can work with" and that they will give the court's findings "serious attention".

He added: "We are pleased that the court has acknowledged that there was no evidence of bias or discrimination in our use of the technology.

"But questions of public confidence, fairness and transparency are vitally important, and the Court of Appeal is clear that further work is needed to ensure that there is no risk of us breaching our duties around equality.

"In 2019 we commissioned academic analysis of this question and although the current pandemic has disrupted its progress, this work has restarted and will inform our response to the Court of Appeal's conclusions." 

AFR technology maps faces in a crowd by measuring the distance between features and then compares results with a "watchlist" of images - which can include suspects, missing people and persons of interest.
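Conceptually, that matching step can be sketched as a nearest-neighbour search: a feature vector measured from a face in the crowd is compared against reference vectors for everyone on the watchlist, and a match is declared only if the closest entry falls within a distance threshold. The sketch below is purely illustrative — the names, vectors, and threshold are invented for the example and bear no relation to the actual AFR Locate system trialled by South Wales Police.

```python
import math

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return the name of the closest watchlist entry within `threshold`, else None.

    `probe` is a feature vector measured from a detected face;
    `watchlist` maps names to reference feature vectors.
    All values here are hypothetical, for illustration only.
    """
    best_name, best_dist = None, float("inf")
    for name, reference in watchlist.items():
        # Euclidean distance between the two feature vectors
        dist = math.sqrt(sum((p - r) ** 2 for p, r in zip(probe, reference)))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# Illustrative watchlist of reference vectors
watchlist = {
    "suspect_A": [0.10, 0.80, 0.30],
    "missing_person_B": [0.90, 0.20, 0.50],
}

print(match_against_watchlist([0.12, 0.78, 0.31], watchlist))  # close to suspect_A
print(match_against_watchlist([5.0, 5.0, 5.0], watchlist))     # no match within threshold
```

Real systems use high-dimensional embeddings produced by a neural network rather than three-element vectors, but the decision logic — measure, compare, threshold — is the same, and the choice of threshold is exactly where questions of bias and false matches arise.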

South Wales Police has been conducting a trial of the technology since 2017.