Procuring law enforcement tech needs greater scrutiny

Source is ComputerWeekly.com

A panel of experts from the US, Belgium and New Zealand have briefed the House of Lords Justice and Home Affairs Committee on the risks of allowing tech firms to push technology into policing.

The peers quizzed the panel on the use of surveillance technology such as body-worn cameras and law enforcement algorithms.

Elizabeth Joh, Martin Luther King Jr professor of law at the University of California Davis School of Law, told the committee that there has been outright experimentation in the use of technology for policing in the US. She warned of the influence of private sector companies, which she said had shaped the adoption of technology by police forces.

“It is the private sector that has developed the technology for law enforcement,” said Joh. “They have an enormous stake and provide incentives.”

For instance, law enforcement agencies are being enticed to use free body-worn cameras for a year, she said. But when the trial is over, the police forces are then locked into using the company’s software and services to access all the surveillance data they have collected. 

Joh said police departments are obliged to send the data collected by body-worn cameras to the company’s own data store, which can be accessed only through its software. “After a year, many police forces have so much body-camera data that they are obligated to use the company’s software,” she added.

Rosamunde Elise Van Brakel, research professor in surveillance studies at the Vrije Universiteit in Brussels, Belgium, and co-director of the Surveillance Studies Network, told the committee: “There is no transparency on the rules on procurement. There is no public information about how decisions are made.”

Van Brakel said that among the challenges police departments face is a lack of sufficient expertise. “I think an important factor is having the expertise to understand what the tech companies are promising the technology will do,” she said. In some European countries, police departments have opted not to buy technology from US tech providers, she added.

Van Brakel also discussed the need to establish oversight bodies and regulations. But some laws, such as those covering data protection, are too restrictive, she said. “Sector-specific regulation could be very helpful because data protection is very much focused on data protection online or in the private sector. There are few guidelines on how the public sector should implement data protection.”

For instance, general regulations cannot capture the specifics for organised crime or child protection, she said.

Before making an investment in law enforcement technology, said Van Brakel, there needs to be an assessment of proportionality, which takes into account compliance with regulations and how the deployment of the technology will impact society, democracy and citizen rights. “How does the technology empower the police to fulfil their societal goal?” she said. “This question is not asked enough. There is not enough reflection on how it helps the police.”

In the worst case, said Van Brakel, regulations and the rights of citizens will be watered down, which may result in certain groups in society being subjected to the technology in a disproportionate way.

Colin Gavaghan, chair of the advisory panel on emergent technologies at New Zealand Police and director of the New Zealand Foundation Centre for Law and Policy in Emerging Technologies, University of Otago, said reducing bias in an algorithm is not simply about removing a single data point, such as ethnicity, because the bias is spread across multiple datasets.
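Gavaghan’s point is sometimes called “fairness through unawareness”: simply dropping the protected attribute does not help when another variable acts as a proxy for it. The sketch below is a minimal, hypothetical illustration with invented synthetic data, in which a model trained only on district (never on group) still flags the two groups at very different rates because district correlates with group.

```python
import random

random.seed(0)

# Hypothetical synthetic data: a protected attribute ("group") and a
# postcode-style proxy ("district") that strongly correlates with it.
def make_record():
    group = random.choice(["A", "B"])
    # 90% of group A lives in district 1, 90% of group B in district 2.
    if group == "A":
        district = 1 if random.random() < 0.9 else 2
    else:
        district = 2 if random.random() < 0.9 else 1
    # The historical "stop" label is biased against group A.
    stopped = random.random() < (0.6 if group == "A" else 0.2)
    return {"group": group, "district": district, "stopped": stopped}

data = [make_record() for _ in range(10_000)]

# "Fairness through unawareness": the model sees only district, never group.
stop_rate = {}
for d in (1, 2):
    rows = [r for r in data if r["district"] == d]
    stop_rate[d] = sum(r["stopped"] for r in rows) / len(rows)

def predict(record):
    # The protected attribute is never consulted...
    return stop_rate[record["district"]] > 0.4

def predicted_rate(group):
    # ...yet predictions still split along group lines via the proxy.
    rows = [r for r in data if r["group"] == group]
    return sum(predict(r) for r in rows) / len(rows)

print(f"group A flagged: {predicted_rate('A'):.2f}")
print(f"group B flagged: {predicted_rate('B'):.2f}")
```

Here the bias survives the removal of the ethnicity-like column entirely, which is why Gavaghan argues that debiasing has to look across datasets rather than at a single field.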

When collecting crime data for the purpose of machine learning, Gavaghan recommended that police departments draw in data from crime reports made by the public, “to break the feedback loop”. This loop arises when the algorithm tells the police where a crime is likely to occur, they investigate, find a crime, and their report reinforces the algorithm’s prediction.
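The feedback loop Gavaghan describes can be sketched with a deliberately simplified, deterministic model (all names and numbers are invented, and both districts are assumed to have the same underlying crime rate). Patrols follow past recorded crime, and crime is only recorded where patrols go, so a small historical skew in the data perpetuates itself; adding public reports, which arrive regardless of where patrols are sent, anchors the record back towards the true rate.

```python
# Equal underlying crime in both districts (assumption for illustration).
TRUE_RATE = {"north": 0.3, "south": 0.3}

def simulate(rounds, public_reports=False):
    # Hypothetical starting data with a small historical skew.
    counts = {"north": 50.0, "south": 10.0}
    for _ in range(rounds):
        total = sum(counts.values())
        for d in counts:
            # Patrols are allocated in proportion to past recorded crime,
            # and crime is only recorded where officers patrol.
            patrols = 100 * counts[d] / total
            counts[d] += patrols * TRUE_RATE[d]
            if public_reports:
                # Public reports arrive everywhere, independent of patrols.
                counts[d] += 50 * TRUE_RATE[d]
    total = sum(counts.values())
    return {d: round(c / total, 2) for d, c in counts.items()}

# Without public reports the initial skew never decays; with them,
# the recorded share drifts back towards the true 50/50 split.
print("police data only: ", simulate(50))
print("with public reports:", simulate(50, public_reports=True))
```

In the police-data-only run the north’s share of the record stays locked at its initial 83%, even though both districts are equally crime-prone; mixing in the independent public reports is what lets the data correct itself.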
