Facing objections over user privacy, Apple said Friday it will delay its plan to scan users' photo libraries for images of child exploitation.
"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," the company said in a statement. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Apple shares were down slightly Friday morning.
Apple stirred controversy as soon as it announced the system, which checks users' devices for illegal child sexual abuse material. Critics argued that a feature that matches images stored in an iCloud account against a database of known "CSAM" imagery was at odds with Apple's messaging around its customers' privacy.
The system does not scan the content of a user's photos. Instead, it computes digital "fingerprints" of images and matches them against the CSAM database. If enough matching images are detected on a user's account, the account is flagged to a human reviewer, who can confirm the imagery and pass the information along to law enforcement if necessary.
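To make the fingerprint-and-threshold idea concrete, here is a minimal, hypothetical sketch of that kind of matching pipeline. It is not Apple's implementation: Apple's system uses a perceptual "NeuralHash" and cryptographic protections on-device, while this sketch uses ordinary SHA-256 digests and an illustrative threshold value; the `KNOWN_FINGERPRINTS` set and the function names are assumptions made for illustration only.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known CSAM, as would be supplied
# by a clearinghouse. Empty here; in practice it would be loaded from a
# vetted hash list. (Illustrative stand-in, not Apple's actual database.)
KNOWN_FINGERPRINTS: set[str] = set()

# Illustrative threshold: an account is only surfaced for human review
# after this many matches are detected.
MATCH_THRESHOLD = 30

def fingerprint(image_path: Path) -> str:
    """Return a digest of the file's bytes; the photo content itself is
    never viewed or interpreted at this stage."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def count_matches(photo_paths: list[Path]) -> int:
    """Count how many photos in an account match known fingerprints."""
    return sum(1 for p in photo_paths if fingerprint(p) in KNOWN_FINGERPRINTS)

def should_flag_for_review(photo_paths: list[Path]) -> bool:
    """Flag the account for human review only once the match count crosses
    the threshold; a reviewer would then confirm the imagery before any
    report to law enforcement."""
    return count_matches(photo_paths) >= MATCH_THRESHOLD
```

The key design point the sketch illustrates is that matching works on fingerprints of known images rather than on an analysis of photo content, and that no single match triggers a report on its own.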
Apple's CSAM detection system was supposed to go live for customers this year. It's unclear how long Apple will delay its release following Friday's announcement.
Despite the concerns about Apple's plan, scanning for this material is standard practice among technology companies. Facebook, Dropbox, Google and many others run systems that automatically detect known CSAM uploaded to their respective services.