Apple unveils plans to scan US iPhones for images of child sex abuse
Source: The Hill
Apple will roll out an update later this year that will include technology in iPhones and iPads that allows the tech giant to detect images of child sexual abuse stored in iCloud, the company announced Thursday.
The feature is part of a series of updates Apple unveiled aimed at increasing child safety, but security researchers and advocates warn that the scanning update, along with one that aims to give parents protective tools in children's messages, could pose data and security risks beyond the intended purpose.
With the new scanning feature, Apple will be able to report detected child sexual abuse material to the National Center for Missing and Exploited Children (NCMEC), which acts as a comprehensive reporting center and works in collaboration with law enforcement agencies across the country. The company will also disable users' accounts if the abusive content is found, Apple said in the update.
Matthew Green, a security professor at Johns Hopkins University, told the Times, "This will break the dam. Governments will demand it from everyone."
Read more: https://thehill.com/policy/technology/566603-apple-unveils-plans-to-scan-us-iphones-for-images-of-child-sex-abuse
getagrip_already
(14,816 posts)
Mawspam2
(738 posts)...unless required by work. When I do, I always keep the camera lens covered with electrical tape just for shit like this.
Locrian
(4,522 posts)that they are using child abuse to "justify" the abuse of privacy.
I H.A.T.E. Apple.
dalton99a
(81,566 posts)he started with a notorious racist murderer
Journeyman
(15,037 posts)I know of no one who doesn't wish to see child pornographers caught and dealt with harshly, but too often the new methods of detection and apprehension require a collective loss of privacy and Fourth Amendment guarantees.
Mysterian
(4,589 posts)I wish the Fourth Amendment was as sacred as the Second.
Girard442
(6,082 posts)They supposedly reported any sexually exploitative images of children, but when the reporter queried company management about the criteria for screening images, they clammed up. Basically, no one could be sure what kind of kid pix could get photographers in legal trouble.
The situation hasn't changed. Could a parent end up in court for a picture of a two-year-old's bare nipple?
And how would this even work? Apple reports troubling material to NCMEC, which does what? Reports its findings to local law enforcement, who break down doors? Or just splashes a scarlet M for "molester" on their (virtual) doors and ruins their lives?
Sgent
(5,857 posts)through a formula that yields a number, and then sees if that number matches their database of numbers from the NCMEC. If it does, they forward your info to them.
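Sgent's description of the matching step can be sketched as below. This is a simplified illustration only: it uses SHA-256 as a stand-in for Apple's perceptual hash (NeuralHash), and the database contents and function names are hypothetical.

```python
import hashlib

# Hypothetical database of known-bad fingerprints (stand-ins for NCMEC-provided hashes).
# The seeded value is simply the SHA-256 hash of the bytes b"known-bad-image".
KNOWN_HASHES = {hashlib.sha256(b"known-bad-image").hexdigest()}

def image_hash(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length fingerprint (a "number").
    Real systems use a perceptual hash so resized or re-encoded copies
    of the same image still match; SHA-256 here is only a stand-in."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the known-hash database."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_database(b"known-bad-image"))  # → True
print(matches_database(b"vacation-photo"))   # → False
```

Note that a cryptographic hash like SHA-256 would miss any image altered by even one pixel, which is exactly why a perceptual hash is used in practice, and also why researchers worry such hashes can be attacked.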
Girard442
(6,082 posts)
Sgent
(5,857 posts)AI is used in the other product they announced, but this one is a matching function.
Kablooie
(18,637 posts)The photo developer reported her.
keithbvadu2
(36,869 posts)Next year they will be looking for visits to certain web sites... such as DU?
Depending on which party is in office and on the Supreme Court.
marble falls
(57,145 posts)Hugh_Lebowski
(33,643 posts)And perhaps even end up shunned by my friends here.
I am happy to discover I don't, based on what's above.
I have ZERO ISSUE with Apple deciding to DELETE any pictures on their 'cloud storage' that are clearly illegal, to be clear. They shouldn't be expected to absorb that liability.
But actually 'scanning phones', the hardware people own, if that's what's being discussed, and then actually reporting people to the authorities?
I'm not sure I'm comfortable, despite the apparent value.
Earth-shine
(4,044 posts)presumably, while they are in transit.
It does not say they will scan the actual phones.
So, if you back up to iCloud, your content will be scanned.
Does anyone here actually think Apple is not already scanning your data? What about Google Drive? Microsoft OneDrive? Carbonite and other cloud backups?
They scan everything in their possession.
Right now, they look for viruses, illegal software, and other potential problems in your backups. With a court order, they'll do a deep scan and pull out your individual files.
Google scans every picture you upload. If Apple is not already doing it, they will soon.
Hey there, Hugh. It's a brave new world.
dalton99a
(81,566 posts)The iPhone operating system will soon store a database of hashes of known child sexual abuse material provided by organizations like the National Center for Missing & Exploited Children, and it will run those hashes against the hashes of each photo in a users iCloud to see if there is a match.
Once there are a certain number of matches, the photos will be shown to an Apple employee to ensure they are indeed images of child sexual abuse. If so, they will be forwarded to the National Center for Missing & Exploited Children, and the users iCloud account will be locked.
https://www.nytimes.com/2021/08/05/technology/apple-iphones-privacy.html
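The two-step process the Times describes, matching against a hash database and escalating to human review only after a certain number of matches, can be sketched like this. The threshold value and all names here are hypothetical; Apple did not disclose the actual threshold at announcement.

```python
from typing import Iterable

MATCH_THRESHOLD = 5  # hypothetical value; the real threshold was not published

def count_matches(photo_hashes: Iterable[str], known_hashes: set) -> int:
    """Count how many of a user's photo hashes appear in the known database."""
    return sum(1 for h in photo_hashes if h in known_hashes)

def should_escalate(photo_hashes: Iterable[str], known_hashes: set) -> bool:
    """Escalate to human review only once the match count reaches the
    threshold, so a single false positive does not flag an account."""
    return count_matches(photo_hashes, known_hashes) >= MATCH_THRESHOLD

known = {"aaa", "bbb", "ccc", "ddd", "eee", "fff"}
user_photos = ["aaa", "zzz", "bbb", "ccc"]
print(should_escalate(user_photos, known))  # → False (3 matches < 5)
```

The threshold is the design choice doing the privacy work in this scheme: no human sees anything until multiple independent matches accumulate, which is meant to make a single hash collision harmless.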
Girard442
(6,082 posts)Now it's on your device. You've been tagged as a perv. All your stuff is fair game now.
dalton99a
(81,566 posts)Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.
The detection system will only flag images that are already in the center's database of known child pornography. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool, which doesn't "see" such images, just mathematical "fingerprints" that represent them, could be put to more nefarious purposes.
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple's algorithm and alert law enforcement. "Researchers have been able to do this pretty easily," he said of the ability to trick such systems.
Other abuses could include government surveillance of dissidents or protesters. "What happens when the Chinese government says, 'Here is a list of files that we want you to scan for,'" Green asked. "Does Apple say no? I hope they say no, but their technology won't say no."
https://www.npr.org/2021/08/06/1025402725/apple-iphone-for-child-sexual-abuse-privacy
Sgent
(5,857 posts)discussion: https://arstechnica.com/tech-policy/2021/08/apple-explains-how-iphones-will-scan-photos-for-child-sexual-abuse-images/
A lot of people aren't going to like this, but I'm actually in favor of a little more policing of the internet.
DVRacer
(707 posts)Who is going to say something about catching kiddie porn? The problem is we should know that they won't stop there. Once the idea of scanning your account for illicit activities becomes normalized, it will expand; it always does. Next it will be drugs or firearms. This is an expansion of the police state. The police could never get the ability to go through your photos without a warrant based on cause, but your Fourth Amendment rights do not apply against corporations.
dalton99a
(81,566 posts)Lokilooney
(322 posts)Demovictory9
(32,468 posts)Mysterian
(4,589 posts)Ahhhhhhh......I remember the old days too...back when we still had a Fourth Amendment. The "war" on some drugs put that on life support.
dalton99a
(81,566 posts)A monorail train displaying Google signage moves past a billboard advertising Apple iPhone security during the 2019 Consumer Electronics Show (CES) in Las Vegas, Nevada, U.S., on Monday, Jan. 7, 2019. Bloomberg | Getty Images
TexasBushwhacker
(20,209 posts)in the cloud, attached to innocent people's IP addresses and then offer to remove it for a price?