
Eugene

(61,823 posts)
Fri Sep 3, 2021, 07:40 PM Sep 2021

Apple delays plans to scan cloud uploads for child sexual abuse images

Source: The Guardian

Apple delays plans to scan cloud uploads for child sexual abuse images

Company says it will ‘collect input and make improvements’ after backlash from privacy groups

Alex Hern Technology editor
@alexhern
Fri 3 Sep 2021 15.36 BST

Apple will delay its plans to begin scanning user images for child sexual abuse material (CSAM) before uploading them to the cloud, the company says, after a backlash from privacy groups.

The company’s proposal, first revealed in August, involved a new technique it had developed called “perceptual hashing” to compare photos with known images of child abuse when users opted to upload them to the cloud. If the company detected enough matches, it would manually review the images, before flagging the user account to law enforcement.

Now, Apple says it is pausing the implementation of the project. “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” the company said in a statement.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

As well as the CSAM scanning, Apple announced and has now paused a second set of updates, which would have seen it using an AI system to identify explicit images sent and received by users under 18 through the company’s Messages app and, where those users were under 13 and had their phones managed by family members, warn a parent or guardian.

-snip-


Read more: https://www.theguardian.com/technology/2021/sep/03/apple-delays-plans-to-scan-cloud-uploads-for-child-sexual-abuse-images
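
For the curious, here is a minimal sketch of what the "perceptual hashing" approach described in the article can look like in practice. Everything below (the average-hash function, the placeholder database values, the distance cutoff) is an assumption for illustration only, not Apple's actual NeuralHash or its on-device cryptographic matching protocol.

# A minimal sketch of perceptual hashing: reduce each photo to a small
# fingerprint, then compare fingerprints by Hamming distance so lightly edited
# or re-saved copies of a known image still match.

from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Classic average hash: shrink to 8x8 grayscale, threshold each pixel at the mean."""
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known images (placeholder values).
KNOWN_HASHES = {0x5F3A9C01D4E2B6F7, 0xA1B2C3D4E5F60718}

def matches_known_image(path: str, max_distance: int = 5) -> bool:
    h = average_hash(path)
    return any(hamming_distance(h, known) <= max_distance for known in KNOWN_HASHES)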
 

Hugh_Lebowski

(33,643 posts)
1. I'm guessing the underlying concern here is that teens send pics and vids of their boobs and junk
Fri Sep 3, 2021, 07:54 PM
Sep 2021

to each other on the regular. My bet is WAAAAAAAY more than any of us old folks on DU can even fathom.

And if you start looking at that, you get into age of consent issues, which vary from state to state ... and from a 'programming' standpoint, this can get extremely complex.

Not to mention, there are surely a lot of pics out there of very young kids running around naked at the beach that were never meant to be 'perverted' at the time.

Things like this complicate what sounds like a straightforward task.

FreeState

(10,570 posts)
2. They scan for known child porn
Fri Sep 3, 2021, 08:19 PM
Sep 2021

And it would only trigger an alert when 30 known child porn images have been uploaded. It only targets known images.
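
A rough sketch of that threshold idea, assuming a hypothetical plain per-account counter (in Apple's published design the count is enforced cryptographically, not by server-side bookkeeping like this):

MATCH_THRESHOLD = 30  # reported number of matches required before any review

def record_match(match_counts: dict, account_id: str) -> bool:
    """Increment an account's match count; return True only once the threshold is reached."""
    match_counts[account_id] = match_counts.get(account_id, 0) + 1
    return match_counts[account_id] >= MATCH_THRESHOLD

# Example: nothing is flagged until the 30th matching upload.
counts = {}
flagged = [record_match(counts, "account-123") for _ in range(30)]
assert flagged[:29] == [False] * 29 and flagged[29] is True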

 

Hugh_Lebowski

(33,643 posts)
3. I think you're underestimating the ambiguity inherent in the concept of 'known child porn images'
Fri Sep 3, 2021, 09:06 PM
Sep 2021

If I'm a 17 yo girl and my 16-year-old boyfriend sends a video of his dick, should I be reported to authorities? Note that a video contains roughly 30 'images' every second.

What if I send that video to my little sister, who's 14?

What if she sends it to some of her little friends to laugh about?

What if one of their dads gets ahold of their phone, and, in outrage, sends that video to his friends, one of whom is secretly gay and happens to enjoy it?

And he turns around and shares it with someone else, for whatever reason.

And on and on and on.

So ... at what point, EXACTLY ... did that video become 'illegal child porn'?

You're talking about a company with well over 1 billion 'customers' world-wide. You HAVE to rely on 'programming' to even begin to 'process' the sheer volume of information you have on your hands.

And the task of programming, I'm saying, is unbelievably complex.

What I'm getting at here is that they're probably taking more time because ... the problem is not as simple as it sounds.

FreeState

(10,570 posts)
4. Nope - there is a database of known child porn
Fri Sep 3, 2021, 09:09 PM
Sep 2021

It’s checked against that database. It’s not looking at the photos and determining whether they’re child porn; it’s looking at the file hash.

https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf


CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC). This process is secure, and is expressly designed to preserve user privacy.

CSAM Detection provides these privacy and security assurances:

• Apple does not learn anything about images that do not match the known CSAM database.
• Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.
• The risk of the system incorrectly flagging an account is extremely low. In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.
• Users can’t access or view the database of known CSAM images.
• Users can’t identify which images were flagged as CSAM by the system.
-snip-
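
To make the distinction concrete: a literal "file hash" check would look roughly like the snippet below (the digest value is a placeholder). Per Apple's summary above, the real comparison uses perceptual hashes of the image content rather than raw file hashes, so visually identical but re-encoded copies still match, which a plain file hash would miss.

import hashlib

# Placeholder hex digest standing in for an entry in a database of known hashes.
KNOWN_FILE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_file(path: str) -> bool:
    return sha256_of_file(path) in KNOWN_FILE_HASHES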

 

Hugh_Lebowski

(33,643 posts)
6. You're talking as though this list is static, but images will be added over time, will they not?
Fri Sep 3, 2021, 09:22 PM
Sep 2021

The complexity of 'tracking who did what when' in terms of Apple's potential liability (as their own products may be part of the 'chain of custody', if you will) is not a simple question.

What if the very REASON some set of photos is in that database ... is that a bunch of Apple users shared them among themselves in the first place?
