Met police deploy facial recognition in Westminster


Source is ComputerWeekly.com

London police deployed live facial recognition (LFR) technology in Westminster on Friday 28 January, resulting in the arrest of four people and attracting significant criticism from civil rights groups.

The Metropolitan Police Service (MPS) said its facial recognition deployment – which took place the day after the UK government relaxed mask-wearing requirements – was part of a wider operation to tackle serious and violent crime in the borough.

According to the MPS, one arrest was of a man wanted on an extradition warrant related to alleged drugs offences and serious assault, while the other three were for unspecified drugs offences, an unspecified traffic offence, and a man wanted in connection with alleged death threats.

The suspects were engaged and detained by officers following alerts from the vehicle-mounted LFR system, which enables police to identify people in real time by scanning their faces and matching them against a database of facial images, or “watchlist”, as they walk by.

Computer Weekly contacted the MPS about various aspects of the deployment – including how many vans were deployed throughout Westminster, how many match alerts the system made, how many stops were carried out, and how many people’s biometric information was processed – but it had not provided answers over a week later.

Silkie Carlo, director of civil liberties group Big Brother Watch, who was present at one of the deployments in Oxford Circus, told Computer Weekly she witnessed four stops while she was present.

“Of the four stops we saw, two shouldn’t have happened – one was in relation to outdated data and another was a straightforward misidentification,” she said, adding that the person who was misidentified, a young black boy, had his fingerprints taken. “There may have been many more misidentifications – that was simply in the short time I was there.”

Carlo said the experience of misidentification by facial recognition could have “a profound effect” on the individuals subjected to it. “This boy who was stopped on Friday had four or five police officers surrounding him, they took his fingerprints, barking questions at him,” she said.

“The police officers aren’t saying, ‘You match the description’ – which is obviously what a lot of teenagers have had to put up with in London for a long time – but, ‘You’ve been flagged by our system’. That must be incredibly disempowering. It’s a lot more [they’re] up against [because they] have to then start proving who they are, and trying to prove they’re not the person they [the police] think they are.”

On the four arrests made, Carlo said there needs to be more clarity from the MPS about the nature of the offences, especially considering its claim the tech was only deployed to look for “serious and violent offenders”.

“If [the traffic offence] is something like speeding, a lot of people are going to think very, very differently about facial recognition,” she said.

“We’ve always had this problem – that they put out in their press releases, ‘We’re only looking for serious and violent offenders’, but in the stops we witness that’s very rarely the case. A 31-year-old man being wanted for drug offences? Is that possession of a small amount of marijuana? Do we really need facial recognition for that?”

A necessary and proportionate deployment?

Before it can deploy facial recognition technology, the MPS must meet a number of requirements related to necessity, proportionality and legality.

For example, the MPS’s legal mandate document – which sets out the complex patchwork of legislation the force claims allows it to deploy the technology – says the “authorising officers need to decide the use of LFR is necessary and not just desirable to enable the MPS to achieve its legitimate aim”.

The MPS’s Data Protection Impact Assessment (DPIA) also says that “all images submitted for inclusion on a watchlist must be lawfully held by the MPS”.

In 2012, a High Court ruling found the retention of custody images – which are used as the primary source of watchlists – by the Metropolitan Police to be unlawful, with unconvicted people’s information being kept in the same way as that of those who were ultimately convicted. It also deemed the minimum six-year retention period disproportionate.

Addressing the Parliamentary Science and Technology Committee on 19 March 2019, then-biometrics commissioner Paul Wiles said there was “very poor understanding” of the retention rules governing custody images across police forces in England and Wales.

“I’m not sure that the legal case [for retention] is strong enough, and I’m not sure that it would withstand a further court challenge,” he said.

Big Brother Watch’s Carlo told Computer Weekly that a police officer on the ground during the Westminster LFR deployment informed her there were 9,500 images on the watchlist for that deployment.

“That’s not a targeted and specified deployment because of a pressing need – it’s a catch net,” she said.

In July 2019, a report from the Human Rights, Big Data & Technology Project based at the University of Essex Human Rights Centre – which marked the first independent review into trials of LFR technology by the Metropolitan Police – highlighted a discernible “presumption to intervene” among police officers using the technology, meaning they tended to trust the outcomes of the system and engage individuals that it said matched the watchlist in use even when they did not.

Computer Weekly contacted the MPS about these issues – including how the force decided the Westminster deployments were necessary, the basis on which the deployment was deemed proportionate, how it had resolved the issue of lawful retention (and whether it could guarantee that every custody image in the 28 January watchlists was held lawfully), and how it had dealt with the “presumption to intervene” – but it had still not responded more than a week after the deployment.

Civil society groups react

In response to the Westminster deployment, policy and campaigns director at human rights group Liberty, Emmanuelle Andrews, described facial recognition as “oppressive by design” and said “its inaccuracy and intrusion will fall hardest on people of colour, especially black men who face routine police oppression”.

Andrews added: “The Court of Appeal has agreed that this technology violates our rights and threatens our liberty. Yet the Met has trialled it repeatedly. These tools are neither necessary nor compatible with the type of society we want to live in. To keep everyone safe, we must reject divisive and oppressive surveillance technology, we must reject ever-increasing and unaccountable police powers, and demand that government works with communities to develop strategies based in fairness, participation and support.”

Carlo said the facial recognition van at Oxford Street on 28 January was accompanied by a significant police presence, including around 25 uniformed officers and 25 plainclothes officers.

“The Metropolitan Police have been mired in scandals all year, and there are serious trust issues…to see such a huge deployment of police experimenting with a very intrusive technology whilst also handing out leaflets and having to explain to members of the public why they’re standing there staring at iPads, waiting for match alerts and scanning their faces seems extraordinary and very, very misguided to me,” she said.

