
Saturday, September 4, 2021

Apple delays controversial 'child safety' feature after privacy outcry - New York Post

Apple bowed to privacy and civil liberties advocates Friday when it agreed to delay and modify a controversial plan to scan users’ photos for child pornography.  

The company’s tool, called “neuralMatch,” would scan images on Apple users’ devices before they’re uploaded to iCloud. A separate tool would sift through users’ encrypted messages for child pornography. 

After Apple announced the effort in August, privacy advocates hit back at the company.

The Electronic Frontier Foundation, a digital privacy group, racked up more than 25,000 signatures on a petition against the tool, while the American Civil Liberties Union said in a letter that the tool would “censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.” 

Critics say the tool could easily be misused by repressive governments to track and punish users for all kinds of content besides child pornography, including political content. 

And some privacy activists have pointed to Apple’s seemingly accommodating relationship with the government of China, where the vast majority of its devices are manufactured, as evidence that the company would allow the tool to be used for political repression.

In a call with reporters prior to Friday’s announcement, an Apple representative was asked whether the company would exit the Chinese market if authorities demanded that it use the scanning tool for other purposes. The representative replied that such a decision would be “above their pay grade,” Vice reported.

In Friday’s announcement, Apple did not provide specifics on how it would change its child protection features, but acknowledged the backlash. 


“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement to multiple media outlets. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

The feature was originally scheduled to be rolled out this year. It’s now unclear when the company plans to release the features or how they’ll be changed.

Apple has said that the tool will flag only images that are already in a database of known child pornography, meaning parents who take photos of their children bathing would not be flagged, for example. 
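Purely as an illustration of that claim, the sketch below shows the general idea of checking an image’s fingerprint against a fixed set of known fingerprints, so that only previously catalogued material can match. Apple’s actual neuralMatch system is far more sophisticated (it is understood to use perceptual matching rather than an exact byte-level hash), and every name and value here is hypothetical.

import hashlib

# Hypothetical database of fingerprints of known, previously catalogued images
# (illustrative value only; a real database would be supplied by child-safety groups).
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual fingerprint; a real system would use a hash
    # designed to survive resizing and re-encoding, not an exact SHA-256.
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag(image_bytes: bytes) -> bool:
    # Only images whose fingerprint already appears in the known database are
    # flagged; a new photo a parent takes of their child produces no match.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

The critics’ concern quoted above is precisely that nothing in this structure limits what goes into the database of “known” material.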

Another one of Apple’s features would have scanned images sent to minors through iMessage for porn, blurring such images and sending a warning to the child.

Johns Hopkins University cybersecurity researcher Matthew Green, a critic of Apple’s features, said Friday’s move “looks promising.” 

“Talk to the technical and policy communities before you do whatever you’re going to do,” Green wrote in a Twitter thread addressing Apple. “Talk to the general public as well. This isn’t a fancy new Touchbar: it’s a privacy compromise that affects 1 [billion] users.” 

Electronic Frontier Foundation Executive Director Cindy Cohn said in a statement to The Post that Friday’s delay, while welcome, doesn’t go far enough. 

“The company must go further than just listening and drop its plans to put a backdoor into its encryption entirely,” Cohn said. “These features would create an enormous danger to iPhone users’ privacy and security, offering authoritarian governments a turnkey mass surveillance system to spy on citizens.”

“The enormous coalition that has spoken out will continue to demand that user phones — both their messages and their photos — be protected, and that the company maintain its promise to provide real privacy to its users,” Cohn added.

Groups like the Electronic Frontier Foundation and the American Civil Liberties Union condemned Apple’s original scanning plan.

