Since 2015, Automated Facial Recognition (AFR) technology has been creeping into the UK, where it has been used by three police forces so far. The controversial technology scans people’s faces in public places, then compares the biometric data it collects against persons of interest on the police’s watchlist.
Many of these operations have been conducted covertly, and without the consent of those who have been scanned. This, of course, has been met with resistance and backlash from human rights campaigners.
One man in particular, Ed Bridges, supported by the human rights organisation Liberty, launched a groundbreaking legal challenge against the use of AFR by South Wales Police (SWP). The case was the first of its kind in the world.
Since 2015, the South Wales, Leicestershire and Metropolitan Police have been using facial recognition technology in public places in the UK to help with investigations.
According to Assistant Chief Constable Richard Lewis at SWP, the use of the technology has been essential: “The world we live in is changing and with that comes a need to change the way we police. We are investing in ensuring our officers have the tools and technology needed to most effectively protect our communities. As technology evolves into the future, so too will the way our police force operates.”
But how exactly does AFR software work? It first requires an existing database of images, obtained through the police’s watchlist, against which other facial images and biometrics are compared. These “other” facial images are acquired using CCTV cameras, which take digital pictures of people in public areas. Using the CCTV’s live footage, the software detects a human face by zooming in and isolating an individual facial image. It then extracts a person’s unique facial features and converts them into a biometric template. This biometric template, extracted from the CCTV, is compared to those on the police watchlist, and a “similarity score” is generated. Interestingly, the threshold value of the “similarity score” can often be set by the end user, meaning it can be as high or as low as they desire.
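The matching step described above can be sketched in a few lines of code. This is a minimal illustration, not the actual software used by any police force: it assumes biometric templates are simple numeric feature vectors, compares them with cosine similarity, and shows how an operator-chosen threshold determines who gets flagged.

```python
import math

def cosine_similarity(a, b):
    # Compare two biometric templates (feature vectors) by the angle
    # between them: 1.0 means identical direction, 0.0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.8):
    # Return (name, score) pairs whose similarity score clears the
    # threshold. A lower threshold flags more people (more false
    # positives); a higher one flags fewer (more missed matches).
    alerts = []
    for name, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= threshold:
            alerts.append((name, score))
    return alerts
```

The key point the article makes is visible in the `threshold` parameter: because it is configurable by the end user, the same system can be tuned to be aggressive or conservative, directly trading false positives against missed matches.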
This kind of technology has predominantly been used by the SWP. Since May 2017, it has used AFR technology over 50 times. According to court documents, the police force “may” have also collected sensitive facial biometric data from 500,000 people, the majority of whom were not suspected of any crime or wrongdoing.
Despite collecting personal data from hundreds of thousands of individuals, the technology itself is deeply flawed. According to a Freedom of Information (FOI) request, fewer than 9% of the matches it produced were ‘true matches’.
One specific example of the technology’s inaccuracy occurred when the SWP used it at the Real Madrid v Juventus game in Cardiff. It was reportedly deployed because 170,000 people were in the city for the match. However, data on the SWP’s website revealed that the AFR software misidentified 2,297 people as criminals.
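To put that figure in context, the false-positive rate for the Cardiff deployment can be worked out with a short calculation. Note an assumption here: the total alert count of 2,470 has been widely reported for this event but does not appear in the SWP data quoted above, so it is used purely for illustration.

```python
false_positives = 2297   # misidentifications, per SWP's published data
total_alerts = 2470      # widely reported total; an assumption, not from SWP's site

true_matches = total_alerts - false_positives
false_positive_rate = false_positives / total_alerts

print(f"True matches: {true_matches}")
print(f"False-positive rate: {false_positive_rate:.0%}")
```

On those figures, roughly nine out of every ten people flagged by the system at the match were flagged wrongly, which is consistent with the FOI finding above.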
Considering the prevalence of false positives, and the masses of data collected without consent, one might assume the technology was at least regulated by an independent oversight body, or authorised by an explicit legal framework. In fact, there is neither.
Aware of the quickly evolving technological landscape of surveillance in the UK, many have been calling for a policy review for years, including the surveillance camera commissioner, Tony Porter. In a recent blog post, the commissioner stated that for years he had been “crying out” for the Surveillance Camera Code of Practice to be updated, a concern he first raised back in 2016 in an official review of the code, where he made strong recommendations to strengthen the legal framework.
Despite this, the Home Office ignored recommendations to update the seven-year-old code, and while the Home Office’s 2018 biometrics strategy promised it would initiate an update, two years later, there have been no developments.
Additionally, according to a report by the Ada Lovelace Institute, the usage of AFR has not garnered public support, either. Published in September 2019, the report revealed that 55% of people want the government to impose restrictions on the police’s usage of AFR. Meanwhile, 46% of those interviewed stated that they wanted the right to opt-out of this kind of technology.
Back in 2018, campaigner and former Liberal Democrat councillor Mr Bridges announced that he believed he was one of many individuals scanned by AFR technology in South Wales. He stated that he believed this occurred on not one but two occasions, including when he attended a peaceful anti-arms protest.
Those who attended the protest outside the Defence, Procurement, Research, Technology and Exportability Exhibition were not notified before the event that such technology would be deployed.
Initially, Mr Bridges, represented by Liberty, contacted Chief Constable Matt Jukes of SWP. In a letter, he called for the police to immediately end their use of AFR technology.
The letter, sent by Liberty, on behalf of Mr Bridges, outlined that the deployment of the technology violated a number of different rights and protections. It stated that use of the technology violated both Article 10 (the right to freedom of expression and freedom of assembly), and Article 8 (the right to privacy) of the European Convention on Human Rights (ECHR). Additionally, the letter argued that due to the technology predominantly misidentifying BAME people and women, it contravened the Equality Act 2010. It also claimed that the SWP’s usage of the technology violated data protection laws.
Upon the police force’s refusal to cease its use of AFR, Liberty, on behalf of Mr Bridges, launched the world’s first legal challenge against the technology. Unfortunately, the court held that although the technology did interfere with Article 8 of the ECHR, the interference was proportionate, and the force was not required to have new statutory powers to regulate it. This was because the technology was only used for a limited time, and was apparently publicised prior to use.
Moreover, the court outlined that the technology had led to arrests and disposals in 37 cases, involving individuals who could not have been located by other means.
The court also ruled that the use of AFR was not disproportionate, and that the processing of individual biometric data was not unwarranted, due to the impact assessment carried out by the police. The claim that the technology also breached Section 149(1) of the Equality Act 2010 was rejected, too.
Liberty on behalf of Mr Bridges launched an appeal against this ruling, which was granted.
In a written submission to the Court of Appeal, Dan Squires QC, acting for Liberty and Mr Bridges, stated that a national roll out of the technology would “radically” change the way in which Britain is policed. He added: “It is not difficult to imagine that police forces nationally could soon – if they cannot already – have access to photographs of the vast majority of the population”.
Ultimately, the appeal was made on five grounds:

1. The court was wrong to hold that the interference with Article 8 was “in accordance with the law”.
2. The court erred in finding that the interference with Article 8 was proportionate.
3. The force’s data protection impact assessment did not comply with the Data Protection Act 2018.
4. The court was wrong not to determine whether SWP had an “appropriate policy document” in place, as required by that Act.
5. SWP had not complied with the Public Sector Equality Duty under Section 149 of the Equality Act 2010.
On 11 August 2020, the Court of Appeal ruled in favour of Mr Bridges and Liberty, and declared that the SWP’s use of the technology was in fact unlawful, based on grounds one, three and five.
Speaking about the ruling, the claimant, Mr Bridges, said: “I’m delighted that the Court has agreed that facial recognition clearly threatens our rights. This technology is an intrusive and discriminatory mass surveillance tool. For three years now South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance”.
Liberty lawyer Megan Goulding added: “This [judgement] is a major victory in the fight against discriminatory and oppressive facial recognition. The Court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties. Facial recognition discriminates against people of colour, and it is absolutely right that the Court found that South Wales Police had failed in their duty to investigate and avoid discrimination”.
Further to this, she outlined that it is time to outlaw the technology completely: “It is time for the Government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned,” she said.
The fight to ban the technology continues, and while this battle may have been won, the war is not over. Considering that the court’s ruling described the benefits of the technology as “potentially great,” while the impact on Mr Bridges was merely “minor,” it appears, unfortunately, that this is not the last we’ll hear of AFR technology.
That being said, hopefully this case will spark a full review and update of the legislative framework governing the use of surveillance. This is the only way to ensure that UK police do not encroach on citizens’ human rights, and that individuals are not further discriminated against.