UK court finds facial recognition technology used by police was unlawful
The use of automatic facial recognition technology by a U.K. police force in South Wales was unlawful, the Court of Appeal ruled Tuesday, in what is being hailed as a landmark judgement.
Like other versions of the technology, the facial recognition software used by South Wales Police (SWP) automatically scans the faces of pedestrians without them knowing and compares the faces to images on a database of persons of interest.
Three judges found SWP had breached privacy rights, data protection laws and equality laws by deploying the technology, called “AFR Locate.” They specifically examined two deployments, although the force used the system on around 50 occasions between May 2017 and April 2019.
The case was brought to court by 37-year-old Cardiff resident Ed Bridges, who is also a civil liberties campaigner. He has been supported by civil liberties organization Liberty.
The Court of Appeal ruled there is no clear guidance from the U.K. privacy regulator on where AFR Locate can be used and who can be put on a police watchlist. It also ruled that the police force’s data protection impact assessment was deficient and that SWP did not take reasonable steps to find out if the software had a racial or gender bias.
The ruling comes after two senior judges at London’s High Court dismissed Bridges’ claim in September 2019, ruling that the technology was in fact lawful.
The Court of Appeal, however, upheld three of the five grounds raised in the appeal, effectively making the technology unlawful until it is approved by the U.K. government.
Bridges, who was in the vicinity of two deployments of AFR Locate by SWP, said he was “delighted” with the decision.
“This technology is an intrusive and discriminatory mass surveillance tool,” he said. “For three years now South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.”
Bridges had his face scanned while he was Christmas shopping in Cardiff in December 2017 and at an anti-arms protest which was held at the Motorpoint Arena in March 2018.
SWP said it had no plans to appeal the decision.
In January, London’s Metropolitan Police force said it planned to start using live facial recognition cameras across the city.
Less than half of the British public (47.5%) trust the use of facial recognition technology to benefit society in the coming five years, according to a study from U.S. tech firm VMware. The same study found that 54% of the British public advocate for restricted access to biometric data, which includes things like facial images and fingerprint data.
Joe Baguley, VP and CTO of VMware in EMEA, said it was right to show caution toward facial recognition technology at this time.
“Recent cases have seen high instances of ‘matches’ that are later labelled as false positives, suggesting the technology does not yet possess enough intelligence to guarantee accurate results or overcome any unconscious bias which may have impacted its development,” he said.