Apple Delays Rollout of Child Safety Features


Source: The New York Times

Apple said on Friday that it would delay its rollout of child safety measures, which would have allowed it to scan users’ iPhones to detect images of child sexual abuse, after criticism from privacy groups.

The company announced in early August that iPhones would begin using complex technology to spot images of child sexual abuse, commonly known as child pornography, that users uploaded to its iCloud storage service. Apple also said it would let parents turn on a feature that would alert them when their children sent or received nude photos in text messages.

The measures faced strong resistance from computer scientists, privacy groups and civil liberties lawyers because the features represented the first technology that would allow a company to look at a person’s private data and report it to law enforcement authorities.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in a statement posted to its website.

Other planned features would have allowed Apple’s virtual assistant, Siri, to direct people who asked about child sexual abuse to appropriate resources, and would have let parents turn on technology that scans images in their children’s text messages for nudity.

The tool that generated the most backlash, however, was a software program that would have scanned users’ iPhone photos and compared them with a database of known child sexual abuse images.

The tech giant announced the changes after reports in The New York Times showed the proliferation of child sexual abuse images online.

Matthew Green, a computer science professor at Johns Hopkins University, said that once the ability to sift through users’ private photos existed, it would have been ripe for misuse. Governments, for example, could lean on Apple’s technology to help track down dissidents.

Apple argued that it was “going to resist pressure from all governments in the world, including China,” Mr. Green said. “That didn’t seem like a very safe system.”

Apple did not appear to anticipate such a backlash. When the company announced the changes, it sent reporters technical explainers and statements from child-safety groups and computer scientists applauding the effort.

But Mr. Green said the company’s move did not seem to take into account the views of the privacy and child safety communities. “If I could have designed a rollout that was intended to fail, it would have looked like this one,” he said.

What matters, experts said, is what Apple will do now that it has hit pause. Will it cancel the initiative entirely, simply roll out nearly identical features after a delay or find a middle ground?

“We look forward to hearing more about how Apple intends to change or improve its planned capabilities to tackle these problems without undermining end-to-end encryption, privacy and free expression,” Samir Jain, the policy director for the Center for Democracy and Technology, an advocacy group, said in a statement.

Joe Mullin, a policy analyst with the Electronic Frontier Foundation, a digital rights group, said the foundation had a petition with more than 25,000 signatures asking Apple not to introduce the feature. He said that it was “great that they’re taking a moment to think things over,” but that he and other privacy advocates would continue to urge Apple to abandon its plan altogether.
