Are Apple’s Tools Against Child Abuse Bad for Your Privacy?


Source is New York Times

Law enforcement officials, child-safety groups, abuse survivors and some computer scientists praised the moves. In statements provided by Apple, the president of the National Center for Missing and Exploited Children called it a “game changer,” while David Forsyth, chairman of computer science at the University of Illinois at Urbana-Champaign, said that the technology would catch child abusers and that “harmless users should experience minimal to no loss of privacy.”

But other computer scientists, as well as privacy groups and civil-liberty lawyers, immediately condemned the approach.

Other tech companies, like Facebook, Google and Microsoft, also scan users’ photos for child sexual abuse material, but they do so only on images stored on the companies’ own servers. In Apple’s case, much of the scanning happens directly on people’s iPhones. (Apple said it would scan only photos that users had chosen to upload to its iCloud storage service, but the matching itself still takes place on the phone.)
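The on-device check described above amounts to comparing a fingerprint of each photo against a database of fingerprints of known abuse imagery before upload. The sketch below is a deliberately simplified illustration of that idea, not Apple's implementation: the real system uses a perceptual "NeuralHash" supplied with a blinded database and cryptographic matching, whereas this example uses an ordinary SHA-256 digest and an in-memory set, both of which are assumptions for illustration only.

```python
import hashlib

# Hypothetical fingerprints of known flagged imagery. Apple's actual
# system matches perceptual hashes provided by child-safety
# organizations; plain SHA-256 is used here purely for illustration.
KNOWN_HASHES = {hashlib.sha256(b"example-flagged-image").hexdigest()}

def fingerprint(image_bytes: bytes) -> str:
    # A cryptographic hash matches only exact byte-for-byte copies;
    # a perceptual hash would also match resized or recompressed images.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """Runs on the device, before the photo is uploaded to cloud storage."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The key architectural point the critics raise is visible even in this toy version: the comparison runs on the user's own device, and nothing in the mechanism constrains what the fingerprint database contains.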

To many technologists, Apple has opened a Pandora’s box. The tool would be the first technology built into a phone’s operating system that can look at a person’s private data and report it to law enforcement authorities. Privacy groups and security experts are worried that governments looking for criminals, opponents or other targets could find plenty of ways to use such a system.

“As we now understand it, I’m not so worried about Apple’s specific implementation being abused,” said Alex Stamos, a Stanford University researcher who previously led Facebook’s cybersecurity efforts. “The problem is, they’ve now opened the door to a class of surveillance that was never open before.”

If governments had previously asked Apple to analyze people’s photos, the company could have responded that it couldn’t. Now that it has built a system that can, Apple must argue that it won’t.

“I think Apple has clearly tried to do this as responsibly as possible, but the fact they’re doing it at all is the problem,” said Eva Galperin, the cybersecurity director of the Electronic Frontier Foundation. “Once you build a system that can be aimed at any database, you will be asked to aim the system at a database.”

