Human rights group Liberty has criticised the UK government’s proposed update to its “surveillance camera code of practice”, claiming it does not properly take into account court findings on the use of live facial-recognition (LFR) technology by police, or the dangers such a surveillance tool presents.
Guidance on the use of surveillance camera systems by UK police and local authorities was implemented in June 2013, but has not been revised in the eight years since.
According to the government’s website, the proposed draft would update the guidance to reflect the passage of the Data Protection Act in May 2018, as well as the Bridges v South Wales Police (SWP) ruling from August 2020, which deemed the force’s use of LFR technology unlawful.
According to that judgment, SWP’s use of the technology was “not in accordance” with Cardiff resident Ed Bridges’s Article 8 privacy rights; the force did not conduct an appropriate Data Protection Impact Assessment (DPIA); and it did not comply with its Public Sector Equality Duty (PSED) to consider how its policies and practices could be discriminatory.
The updated code of practice now says that LFR deployments must take into account the PSED and any potential adverse impact on protected groups; be justified and proportionate; and quickly delete any unused biometric data collected.
Police forces will also need to follow a stricter authorisation process, with deployments approved by chief police officers, and to publish the categories of people to be included on LFR watchlists, as well as the criteria used to determine when and where to deploy the technology.
The government has opened a consultation on the updated code, which is open to a “wide range of stakeholders” and closes on 8 September 2021.
Poor guidance
However, Megan Goulding, a lawyer at human rights group Liberty who was involved in the Bridges case, told IT Pro: “These guidelines fail to properly account for either the court’s findings or the dangers created by this dystopian surveillance tool.
“Facial recognition will not make us safer, it will turn public spaces into open-air prisons and entrench patterns of discrimination that already oppress entire communities.” She added that it is “impossible to regulate for the dangers created by tech that’s oppressive by design”, and that the safest solution is to ban the technology.
A petition launched by Liberty calling for a ban on the use of LFR by police and private companies had reached 57,568 signatures at the time of publication.
Although the 20-page code of practice outlines 12 guiding principles that surveillance camera system operators should adopt, LFR is explicitly mentioned only six times, at the very end of the document, and the code goes into little detail about it.
“I don’t think it provides much guidance to law enforcement, and I don’t really think it provides a great deal of guidance to the public as to how the technology will be deployed,” Tony Porter, the UK’s former surveillance camera commissioner, told the BBC.
Porter, who is now chief privacy officer at facial-recognition supplier Corsight AI, added that the code as currently written is very “bare bones”, and questioned why Transport for London (TfL), which owns thousands of cameras, is not covered by the new code when smaller councils are.
In response to the criticism, the Home Office said: “The government is committed to empowering the police to use new technology to keep the public safe, whilst maintaining public trust, and we are currently consulting on the Surveillance Camera Code.
“In addition, College of Policing have consulted on new guidance for police use of LFR in accordance with the Court of Appeal judgment, which will also be reflected in the update to the code.” It added that all users of surveillance camera systems, including LFR, are required to comply with strict data protection legislation.
Calling for bans
In June 2021, two pan-European data protection bodies – the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) – jointly called for a general ban on the use of automated biometric identification technologies in public spaces, arguing that they present an unacceptable interference with fundamental rights and freedoms.
This would include banning the use of AI to identify faces, gait, fingerprints, DNA, voices and keystrokes, as well as any other biometric or behavioural signals, in any context.
“Deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places,” said Andrea Jelinek, EDPB chair, and Wojciech Wiewiórowski, the EDPS, in a joint statement. “Applications such as live facial recognition interfere with fundamental rights and freedoms to such an extent that they may call into question the essence of these rights and freedoms.”
While the UK’s information commissioner, Elizabeth Denham, did not go as far as her European counterparts in calling for a ban on LFR and other biometric technologies, she said in June that she was “deeply concerned” about the inappropriate and reckless use of LFR in public spaces, noting that none of the organisations investigated by her office had been able to fully justify its use.
“Unlike CCTV, LFR and its algorithms can automatically identify who you are and infer sensitive details about you. It can be used to instantly profile you to serve up personalised adverts or match your image against known shoplifters as you do your weekly grocery shop,” she wrote in a blog post.
“It is telling that none of the organisations involved in our completed investigations were able to fully justify the processing and, of those systems that went live, none were fully compliant with the requirements of data protection law. All of the organisations chose to stop, or not proceed with, the use of LFR.”
Other digital rights campaign groups, including Big Brother Watch, Access Now, and European Digital Rights, have also previously called for bans on the use of biometric technologies, including LFR.