Police tech introduced with little scrutiny or training

UK police do not have the resources to properly scrutinise their deployments of new algorithmic technologies, and officers receive very little training on how to operate the systems being introduced, senior police officers and staff have told the Lords.

Speaking to the Lords Justice and Home Affairs Committee, which opened an inquiry into the use of advanced algorithmic tools by law enforcement in May 2021, Paul Taylor, the national policing chief scientific adviser at the National Police Chiefs’ Council (NPCC), highlighted the conflicting interests between police forces and their technology suppliers.

“To be able to sufficiently scrutinise these new technologies and all the permutations, to be able to 100% tell you what they are and aren’t doing, requires a level of resource that we simply do not have – the amount of testing and testing and testing one would need is just not there,” he said.

“If I turn around to industry and say ‘I’m going to hold you responsible’, they will then want to do a level of testing that probably makes investment into that technology not viable anymore; not an interesting product for them to engage into.

“There’s a tension there [because] we don’t want to stifle the market by saying that you have to do all of this [testing], but equally, of course, we need them to do it, and so that’s a real challenge for us,” said Taylor.

He added that part of the problem is that police are often taking a fairly mature technology and trying to implement it in a policing context it was not explicitly designed for.

“Procurement, in my mind, really needs to be much more upstream so that we’re working with providers – be those academia, be those industry – so that when they develop their products, at a much earlier stage, they think about issues of fairness and how it will work in policing, and they are designing things that fit,” he said.

Ethics committees

Taylor added that many of the 43 forces in England and Wales have established ethics committees specifically to deal with the introduction and operation of new technologies, to which his role as chief scientific adviser provides an additional layer of independent scrutiny.

Alun Michael, joint lead for data and bioethics at the Association of Police and Crime Commissioners (APCC), said that scrutiny is also provided by police and crime commissioners (PCCs), an elected role introduced by the Police Reform and Social Responsibility Act 2011 to make police forces more answerable to their local communities.

On whether police officers receive adequate training to operate new algorithmic technologies reliably and effectively, David Tucker, faculty lead on crime and criminal justice at the College of Policing, said that training is implemented at a national level for UK-wide systems such as the Police National Computer (PNC) and the Police National Database (PND), but that there is no training in place for most other technologies, such as facial recognition, because they are locally procured.

“There are national training programmes in place for those because they are national products,” he said. “For most of the other technologies, they are locally procured, so it’s not something the College [of Policing], as a national training provider, could do.”

He added that “there’s actually very little mandatory training in policing”, and that whether training is designed and delivered nationally or locally, the teaching of all UK police officers is underpinned by “accountability, transparency, ethics [and] legality”.

Both Tucker and Taylor added, however, that training is provided by the technology suppliers themselves.

Citing a lack of clarity, the Lords requested a full breakdown of the “family tree of organisations” – including PCCs, the NPCC, the APCC, the various national coordination committees and more – that are responsible for the operation of new technologies in policing.

Embedding better scrutiny through procurement practices

In the UK, public authorities are obliged under the Public Sector Equality Duty (PSED) to consider, for example, how their policies and practices could be discriminatory. However, because the private companies these authorities procure technologies from often want to protect their intellectual property and trade secrets, there can be competing interests.

In August 2020, for example, South Wales Police’s (SWP) use of live facial-recognition technology was deemed unlawful by the Court of Appeal, in part because the force did not comply with its PSED.

The judgment noted that the manufacturer in that case – Japanese biometrics firm NEC – did not divulge details of its system to SWP, meaning the force could not fully assess the technology and its impacts.

In October 2021, the committee heard from Sandra Wachter, an associate professor and senior research fellow at the University of Oxford, that while a balance needs to be struck between commercial interests and transparency, people have a right to know how life-changing decisions about them are made. 

“When people don’t want to tell you how [an algorithm] is working, it’s either because they don’t want to, or because they don’t know,” she said. “I don’t think either is acceptable, especially in the criminal justice sector.

“When people say it’s just about trade secrets, I don’t think that’s an acceptable answer. Somebody has to understand what’s really going on. The idea that liberty and freedom can be trumped by commercial interests, I think, would be irresponsible – especially if there is a way to find a good middle ground where you can fully understand what an algorithm is doing … without revealing all the commercial secrets.”

On embedding better procurement practices in policing, Tucker said forces need to be “intelligent customers” by understanding the problem they are trying to solve and how technology will contribute to that problem-solving process.

“Where it’s not possible to do that testing in advance, there [need to be] clear baselines so that evaluation can happen, and so that people can make an assessment about whether that technology is delivering what it’s supposed to deliver,” he said.

Speaking to the same committee in October 2021, Karen Yeung, an Interdisciplinary Professorial Fellow in Law, Ethics and Informatics at Birmingham Law School, said a key issue with police deployments of new technologies is that authorities have started using them “just because we can … without clear evidence” about their efficacy or impacts.

Such technologies include facial recognition and algorithmic crime “prediction” tools, such as the Metropolitan Police Service’s (MPS) Gangs Matrix or Durham Constabulary’s Harm Assessment Risk Tool.

Yeung further noted that the use of such technologies by police, especially without rigorous scientific testing, has the potential to massively entrench existing power discrepancies in society, as “the reality is we’ve tended to use the historic data that we have, and we have data in the masses, mostly about people from lower socio-economic backgrounds”.
