General Discussion
Apple delays plans to scan cloud uploads for child sexual abuse images
Source: The Guardian
Company says it will collect input and make improvements after backlash from privacy groups
Alex Hern Technology editor
@alexhern
Fri 3 Sep 2021 15.36 BST
Apple will delay its plans to begin scanning user images for child sexual abuse material (CSAM) before uploading them to the cloud, the company says, after a backlash from privacy groups.
The company's proposal, first revealed in August, involved a new technique it had developed called perceptual hashing to compare photos with known images of child abuse when users opted to upload them to the cloud. If the company detected enough matches, it would manually review the images before flagging the user account to law enforcement.
Now, Apple says it is pausing the implementation of the project. "Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material," the company said in a statement.
"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
As well as the CSAM scanning, Apple announced and has now paused a second set of updates, which would have seen it using an AI system to identify explicit images sent and received by users under 18 through the company's Messages app and, where those users were under 13 and had their phones managed by family members, warn a parent or guardian.
-snip-
Read more: https://www.theguardian.com/technology/2021/sep/03/apple-delays-plans-to-scan-cloud-uploads-for-child-sexual-abuse-images
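To make the "perceptual hashing" idea in the article concrete, here is a minimal Python sketch of the simplest variant, an "average hash". This is not Apple's NeuralHash (which uses a neural network and is far more robust to edits); the function and file names here are illustrative only. The point is that visually similar images produce nearby hashes, unlike a cryptographic file hash, where changing a single byte produces a completely different digest.

```python
from PIL import Image  # pip install Pillow

def average_hash(path, size=8):
    """Shrink to a tiny grayscale grid; set one bit per pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Bits that differ between two hashes; a small distance means 'visually similar'."""
    return bin(a ^ b).count("1")

# A resized or recompressed copy of the same photo lands near the original's
# hash, so matching tolerates small edits -- unlike an exact file digest:
# hamming(average_hash("upload.jpg"), known_hash) <= 5  ->  likely the same image
```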
Hugh_Lebowski
(33,643 posts)
to each other on the regular. My bet is WAAAAAAAY more than any of us old folks on DU can even fathom.
And if you start looking at that, you get into age of consent issues, which vary from state to state ... and from a 'programming' standpoint, this can get extremely complex.
Not to mention, there's surely a lot of pics of very young kids running around naked at the beach out there in the world, that were never meant to be 'perverted' at the time.
Things like this complicate what sounds like a straightforward problem.
FreeState
(10,570 posts)
And it would only trigger an alert when 30 known child porn images have been uploaded. It only targets known images.
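A hypothetical sketch of that threshold rule, assuming a made-up record_match helper and per-account tallies; nothing here inspects image content, it only counts how many uploads have already matched the known-image list:

```python
from collections import Counter

MATCH_THRESHOLD = 30  # the review threshold Apple announced

match_counts = Counter()  # hypothetical per-account tallies of hash matches

def record_match(account_id):
    """Tally one known-image match; True once the account crosses the threshold."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD
```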
Hugh_Lebowski
(33,643 posts)
If I'm a 17 yo girl and my 16 year old boyfriend sends a video of his dick, should I be reported to authorities? Note that a video contains 30 'images' every second.
What if I send that video to my little sister, who's 14?
What if she sends it to some of her little friends to laugh about?
What if one of their dads gets ahold of their phone, and, in outrage, sends that video to his friends, one of whom is secretly gay and happens to enjoy it?
And he turns around and shares it with someone else, for whatever reason.
And on and on and on.
So ... at what point, EXACTLY ... did that video become 'illegal child porn'?
You're talking about a company with well over 1 billion 'customers' world-wide. You HAVE to rely on 'programming' to even begin to 'process' the sheer volume of information you have on your hands.
And the task of programming, I'm saying, is unbelievably complex.
What I'm getting at here is that they're probably taking more time because ... the problem is not as simple as it sounds.
FreeState
(10,570 posts)
It's checked against. It's not looking at the photos and determining if it's child porn. It's looking at the file hash.
https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC). This process is secure, and is expressly designed to preserve user privacy.
CSAM Detection provides these privacy and security assurances:
Apple does not learn anything about images that do not match the known CSAM database.
Apple can't access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.
The risk of the system incorrectly flagging an account is extremely low. In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.
Users can't access or view the database of known CSAM images.
Users can't identify which images were flagged as CSAM by the system.
For detailed information about the cryptographic protocol and security proofs, see the full technical summary.
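The mechanism the summary describes is membership testing against a fixed database of hashes. A rough Python sketch of that property, using a plain SHA-256 lookup as a stand-in (Apple's actual design uses NeuralHash perceptual hashes plus a private set intersection protocol, so matches below the threshold are never visible in the clear):

```python
import hashlib

# Hypothetical stand-in for the NCMEC-provided list of known-image hashes.
# In the real system these are perceptual hashes, not SHA-256 digests.
known_hashes = set()

def file_sha256(path):
    """Digest of the file's exact bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known_database(path):
    # A photo that was never added to the database -- a family beach
    # snapshot, a newly recorded video -- cannot match, whatever it depicts.
    return file_sha256(path) in known_hashes
```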
Hugh_Lebowski
(33,643 posts)
The complexity of 'tracking who did what when', in terms of Apple's potential liability (as their own products may be part of the 'chain of custody', if you will), is not a simple question.
What if the very REASON some set of photos is in that database ... is because a bunch of Apple users shared it among themselves in the first place?