South Wales Police to bring back facial recognition cameras after report finds 'no risk of bias'


Live facial recognition cameras will be brought back into use by a Welsh police force after a report concluded the technology can be used in a way that does not discriminate based on gender, age or ethnicity.

South Wales Police had paused their use of the technology following a Court of Appeal judgement in 2020 that highlighted the need for more work confirming the software does not discriminate based on "race or sex".

A series of trials has since been carried out, and a report published on April 5 by the National Physical Laboratory found the technology's settings can be operated in a way that creates no risk of bias or discrimination.

South Wales Police said this has confirmed their use of live facial recognition is not biased. The force's Chief Constable, Jeremy Vaughan, said they would resume using the technology and emphasised that this will help keep the public safe.

However, campaign group Big Brother Watch said that, as the report highlights that certain settings must be used to mitigate biases, police forces should not be using technology that has the ability to be discriminatory at all.

South Wales Police first started trialling the technology in 2017.

How does live facial recognition technology work?

A live camera feed compares faces it captures with a predetermined "watch list", in order to find people who are on that list.

Watch lists contain people who are suspected of involvement in crimes and who are deemed likely by police to be at the location where the facial recognition cameras are being used.

Data associated with a match is kept for up to 24 hours but in the event of no match, the data is immediately and automatically deleted.

South Wales Police say the decision to stop an individual is not made by the technology: two sets of officers, a team in the van and an ‘on the ground’ intervention team, decide whether they believe the individual on the street matches the wanted person.
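The process described above can be sketched in a short, hypothetical Python example. The similarity threshold, embedding format and function names here are assumptions for illustration, not details of the force's actual system:

```python
from dataclasses import dataclass

# Hypothetical match threshold; real deployments tune such settings carefully.
MATCH_THRESHOLD = 0.6
RETENTION_SECONDS = 24 * 60 * 60  # match data kept for up to 24 hours


@dataclass
class Detection:
    embedding: list      # face embedding produced from the live camera feed
    captured_at: float   # capture timestamp (seconds since epoch)


def similarity(a, b):
    """Toy cosine similarity between two embeddings (illustrative only)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0


def process_detection(detection, watchlist):
    """Compare a captured face against a predetermined watch list.

    A match is only an alert for the officers to review; the decision to
    stop anyone remains a human one. With no match, the biometric data is
    discarded immediately and automatically.
    """
    best_name, best_score = None, 0.0
    for name, ref_embedding in watchlist.items():
        score = similarity(detection.embedding, ref_embedding)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= MATCH_THRESHOLD:
        # Match data retained for up to 24 hours while officers decide.
        return {"alert": best_name, "score": best_score,
                "expires_at": detection.captured_at + RETENTION_SECONDS}
    # No match: nothing is stored.
    return None
```

The key design point reflected here is that the software only raises candidate matches; retention and deletion rules, and the final identification, sit outside the matching step.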

However, a legal challenge to the trials was launched by Ed Bridges, whose face was scanned while shopping in Cardiff in 2017 and then again at a peaceful anti-arms protest in 2018.

While two judges previously ruled that the use of facial recognition was lawful, Mr Bridges appealed that ruling.

In 2020 the Court of Appeal found that South Wales Police's use of facial recognition technology breached privacy and data protection laws.

In its findings, the court said that, despite there being no clear evidence the technology was biased on grounds of race or sex, the force had not done all it could to verify this.


People in Cardiff were split on the idea of reintroducing the live facial recognition cameras, with some uneasy about the use of the technology.


Subsequent trials took place and on April 5, South Wales Police said data from these trials had undergone independent evaluation, which found no issues with bias or discrimination in the way the cameras are used by the force.

Chief Constable Jeremy Vaughan, who is also the national policing lead on biometrics, said facial recognition technology is a "force for good" and has an important role in keeping communities safe.

He said: “The study confirms that the way South Wales Police uses the technology does not discriminate on the grounds of gender, age or race and this reinforces my long-standing belief that the use of facial recognition technology is a force for good and will help us keep the public safe and assist us in identifying serious offenders in order to protect our communities from individuals who pose significant risks.

"...I believe we are now in a stronger position than ever before to be able to demonstrate that the use of facial recognition technology is fair, legitimate, ethical and proportionate.”

South Wales Police said that prior to the Court of Appeal challenge, live-time deployments of facial recognition in the force area resulted in 61 people being arrested for offences including robbery, violence, theft and failure to respond to court warrants.

The technology was used at major sporting and public events in Cardiff and Swansea.

Chief Constable Vaughan added: "It is right and proper that our use of technology is subject to legal challenge and scrutiny and the work that has been carried out to scrutinise and test this ground-breaking technology gives me confidence that we are meeting our equality obligations."

Those against the use of the technology claim it is an invasion of privacy.

In the case of Ed Bridges, his lawyers argued the use of automatic facial recognition (AFR) by South Wales Police caused him "distress" and violated his privacy and data protection rights by processing an image taken of him in public.

Big Brother Watch call the technology "dangerously authoritarian surveillance" and say it signals "an enormous expansion of the surveillance state". The group has launched a UK-wide campaign to stop the use of live facial recognition by the police and private companies.

In response to the report, Madeleine Stone, Legal and Policy Officer at Big Brother Watch, said: "Live facial recognition is suspicion-less, mass surveillance that turns us into walking ID cards, subjecting innocent people to biometric police identity checks.

"This report confirms that live facial recognition does have significant race and sex biases, but says that police can use settings to mitigate them. Given repeated findings of institutional racism and sexism within the police, forces should not be using such discriminatory technology at all.

"Police forces have also refused to disclose the ethnicity breakdowns of watchlists, but at facial recognition deployments, we have repeatedly witnessed black men, even black children, being wrongly matched and stopped.

"1 in 6,000 people being wrongly flagged by facial recognition is nothing to boast about, particularly at deployments in large cities where tens of thousands of people are scanned per day. If rolled out across the UK, this could mean tens of thousands of us will be wrongly flagged as criminals and forced to prove our innocence."
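Taking the figures in Ms Stone's statement at face value, the implied scale is simple to check. The daily scan count below is an assumed illustrative figure, since the quote says only "tens of thousands of people":

```python
# Back-of-envelope check of the quoted 1-in-6,000 false-flag rate.
false_flag_rate = 1 / 6000
daily_scans = 30_000  # assumed figure for "tens of thousands" scanned per day

expected_false_flags = false_flag_rate * daily_scans
print(expected_false_flags)  # 5.0 wrongful flags per day at one deployment
```

On these assumptions, a single busy deployment would wrongly flag roughly five people a day, which is the kind of cumulative total the campaign group is pointing to.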

South Wales Police say there has not been a single wrongful arrest as a result of their use of facial recognition.

Ms Stone added: "Live facial recognition is not referenced in a single UK law, has never been debated in parliament, and is one of the most privacy-intrusive tools ever used in British policing. Parliament should urgently stop police from using this dangerously authoritarian surveillance tech."