Police IT buyers should compel suppliers to prove AI claims

Criminal justice sector (CJS) bodies procuring artificial intelligence (AI) technologies should use their purchasing power to demand access to suppliers’ systems to test and prove their claims about accuracy and bias, an expert witness has told a House of Lords inquiry.

Launched in May 2021 by the Lords Home Affairs and Justice Committee, the inquiry is looking at the use of new technologies by law enforcement, including AI and facial recognition.

Speaking to the Committee about the procurement of advanced algorithmic technologies by police forces and CJS bodies, Sandra Wachter, an associate professor and senior research fellow at the University of Oxford, said there needs to be much greater transparency around how the systems purchased by law enforcement work internally.

Wachter has previously worked alongside Brent Mittelstadt, a senior research fellow in data ethics at the Oxford Internet Institute, and Chris Russell, a group leader in safe and ethical AI at the Alan Turing Institute, to develop an anti-discrimination test known as Conditional Demographic Disparity (CDD), which helps developers look for bias in their AI systems.
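For illustration only, the sketch below shows the general idea behind a CDD-style check: a weighted average of per-stratum demographic disparities, computed here on a toy pandas DataFrame. The column names, toy data and function names are hypothetical assumptions for this sketch, not the authors' reference implementation.

```python
# A minimal, illustrative sketch (not the authors' reference implementation).
# Assumes a pandas DataFrame with hypothetical columns:
#   "group"   - protected attribute value for each record
#   "outcome" - 1 for a favourable decision, 0 for an unfavourable one
#   "stratum" - the attribute to condition on (e.g. offence category)
import pandas as pd


def demographic_disparity(df: pd.DataFrame, group_value: str) -> float:
    """Share of the group among unfavourable outcomes minus its share among favourable ones."""
    unfavourable = df[df["outcome"] == 0]
    favourable = df[df["outcome"] == 1]
    if len(unfavourable) == 0 or len(favourable) == 0:
        return 0.0  # simplification: skip strata containing only one outcome class
    return (unfavourable["group"] == group_value).mean() - (favourable["group"] == group_value).mean()


def conditional_demographic_disparity(df: pd.DataFrame, group_value: str) -> float:
    """Average of per-stratum disparities, weighted by stratum size."""
    return sum(
        len(stratum) / len(df) * demographic_disparity(stratum, group_value)
        for _, stratum in df.groupby("stratum")
    )


# Toy data: a positive result suggests the group is over-represented among
# unfavourable outcomes even after conditioning on the stratum variable.
data = pd.DataFrame({
    "group":   ["a", "a", "b", "b", "a", "b", "a", "b"],
    "outcome": [0,   0,   1,   1,   0,   1,   1,   0],
    "stratum": ["x", "x", "x", "x", "y", "y", "y", "y"],
})
print(conditional_demographic_disparity(data, "a"))  # 0.5 on this toy sample
```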

“When people don’t want to tell you how [an algorithm] is working, it’s either because they don’t want to, or because they don’t know. I don’t think either is acceptable, especially in the criminal justice sector,” she said, adding that while a balance needs to be struck between commercial interests and transparency, people have a right to know how life-changing decisions about them are made.  

“When people say it’s just about trade secrets, I don’t think that’s an acceptable answer. Somebody has to understand what’s really going on. The idea that liberty and freedom can be trumped by commercial interests, I think, would be irresponsible, especially if there is a way to find a good middle ground where you can fully understand what an algorithm is doing … without revealing all the commercial secrets.”

In the UK, public authorities are obliged under the Public Sector Equality Duty (PSED) to consider how their policies and practices could be discriminatory. However, competing interests can arise because the private companies from which these authorities procure technologies often want to protect their intellectual property and trade secrets.

Unlawful use

In August 2020, for example, the use of live facial-recognition technology by South Wales Police (SWP) was deemed unlawful by the Court of Appeal, in part because the force did not comply with its PSED.

The judgment noted that the manufacturer in that case, Japanese biometrics firm NEC, did not divulge details of its system to SWP, meaning the force could not fully assess the technology and its impacts.

“For reasons of commercial confidentiality, the manufacturer is not prepared to divulge the details so that it could be tested. That may be understandable, but in our view it does not enable a public authority to discharge its own, non-delegable, duty under section 149,” said the ruling.

Addressing the Committee, Wachter said AI developers often claim that they have removed any bias from their datasets through various “debiasing” techniques: “Whenever somebody says that, that’s probably a lie. The only way to debias data is to debias humans and collect the data they are producing … so when people are claiming those kinds of things we have to be very critical.”

She added that the same applies when developers make claims about the accuracy of their AI systems, which are essentially designed to make predictions about future behaviour, something that is inherently difficult to verify.

“Making claims about accuracy can be quite irresponsible, as there’s a disconnect between what I’m predicting and what the thing is that I’m using to predict – I’m always using a proxy for it because I don’t actually know the future,” she said. “I think if a public body’s thinking of purchasing the technology, then it should use its purchasing power to demand access – ‘show me the work, if you can actually do that thing then show it to me’ – if they can’t, it’s a red flag.

“I understand that sometimes there is maybe no internal resource or time available but there are opportunities to work with researchers, independent researchers, that are not on the payroll of anybody, and have no stake in what the result actually is.”

Wachter added that once access to the systems and data has been provided, independent researchers would be able to test them to see whether they can reproduce what has been claimed, so that law enforcement buyers can make informed decisions.
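As a rough illustration of what such independent testing might involve, the hedged sketch below checks a supplier's claimed accuracy figure against results on an independently labelled hold-out sample. The claimed figure, sample counts and decision rule are hypothetical placeholders, not drawn from the inquiry.

```python
# A minimal sketch of one reproducibility check an independent reviewer might run.
# The claimed accuracy, sample size and counts below are hypothetical placeholders.
from math import sqrt


def wilson_interval(correct: int, total: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score interval for an observed accuracy."""
    p = correct / total
    denom = 1 + z ** 2 / total
    centre = (p + z ** 2 / (2 * total)) / denom
    margin = z * sqrt(p * (1 - p) / total + z ** 2 / (4 * total ** 2)) / denom
    return centre - margin, centre + margin


claimed_accuracy = 0.95      # figure quoted by the supplier (hypothetical)
correct, total = 880, 1000   # performance on an independent, labelled hold-out set (hypothetical)

low, high = wilson_interval(correct, total)
print(f"observed accuracy {correct / total:.3f}, 95% CI ({low:.3f}, {high:.3f})")
if claimed_accuracy > high:
    print("claimed accuracy falls outside the interval - a red flag worth querying")
```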

The Committee was previously warned on 7 September 2021 about the dangers of using AI and algorithms to predict, profile or assess people’s risk or likelihood of criminal behaviour, otherwise known as “predictive policing”.

Rosamunde Elise van Brakel, co-director of the Surveillance Studies Network, for example, noted that the data “often used is arrests data, and it has become very clear that this data is biased, especially as a result of ethnic profiling by the police”, and that for as long as “this data has this societal bias baked in, the software will always be biased”.

She added: “The first step here is not a technological question, it is a question about how policing and social practices are already discriminatory or are already biased. I do not think you can solve this issue by tweaking the technology or trying to find AI to spot bias.”

Speaking to the Committee, David Lewis, a former deputy chief constable of Dorset Police and former ethics lead at the National Police Chiefs’ Council, agreed that as purchasers, police and other CJS bodies should not allow black-box solutions “where we don’t know what’s inside the box … that would be non-negotiable”.

He added: “When you’ve got something that needs scrutiny but can’t be exposed to the public gaze, the police service is actually quite experienced at bringing people close to the use of sensitive tactics or covert issues.

“We can use some of the practices that we’ve developed there, bringing in specialists, independent people who understand, but are properly vetted and are therefore not going to betray, in this case, trade secrets.”

Source is ComputerWeekly.com