Apple to Delay iPhone Update That Could Scan Device for Illegal Content
Source: The Wall Street Journal.
Tech giant says it will take more time to collect input and make improvements before releasing feature
By Joanna Stern and Tim Higgins
Updated Sept. 3, 2021 12:10 pm ET
Apple Inc. is delaying the rollout of tools aimed at combating child pornography on iPhones after sparking concern among privacy advocates that the software could create broader risks for users.
The Cupertino, Calif., tech giant said Friday it would take additional time to make improvements on the plan announced last month, the second time in a year that it has delayed a new privacy feature after an outcry from critics over the potential ramifications.
Read more: https://www.wsj.com/articles/apple-to-delay-iphone-update-that-could-scan-device-for-illegal-content-11630676309
getagrip_already
(14,741 posts)
The media is saying it compares the hash of images against known child porn images, which is kaka.
What they are actually doing is using powerful AI algorithms to analyze all of the images on a phone, and then sending an encrypted set of metadata back to Apple with the results.
The routines can detect images of men, women, children, animals, etc. But it also puts them into context such as "Female, suggestive pose, underwear or bikini". It can also analyze in context of "couple, naked, sex act". Suppose they supply Saudi Arabia with hits on "Male couple, sex act, naked"?
It can upload any image that meets their criteria without you knowing. They are not a government agency. They are not following any government mandates. They are not being transparent about the criteria they are using to analyze images or who will have access to any images uploaded. They only say they will notify LE after review.
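For contrast, the hash-matching approach the media describes can be sketched in a few lines. This is a toy illustration only, not Apple's actual NeuralHash: it uses a simple "average hash" over a grayscale grid and flags only near-exact matches against a list of known hashes, which is why it cannot classify or describe image content the way the poster's ML scenario would. All names and example data here are hypothetical.

```python
# Toy sketch of perceptual-hash matching (NOT Apple's real system).
# Each image hash is a bitmask: one bit per pixel, set if that pixel
# is brighter than the image's mean brightness.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns an int bitmask."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(h, known_hashes, threshold=2):
    """Flag only near-exact matches; unrelated images fall outside the threshold."""
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Hypothetical example data: a tiny 2x2 "image" and variants of it.
img = [[10, 200], [220, 30]]
known = {average_hash(img)}           # pretend this hash is on a known list
similar = [[12, 198], [225, 28]]      # slightly altered copy of img
unrelated = [[200, 10], [30, 220]]    # a different image

print(matches_known(average_hash(similar), known))    # True
print(matches_known(average_hash(unrelated), known))  # False
```

The design point of such schemes is exactly the narrowness the poster disputes: a hash match says "this is (nearly) a known file", nothing about what an arbitrary new photo depicts.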
Sure, I trust them, completely... NOT!
NurseJackie
(42,862 posts)
... it will always be doubted.
displacedtexan
(15,696 posts)
and placed a bar across her bare chest on my daughter's Facebook page. It was both embarrassing and humiliating.
Also, my iPad (Chrome) warns me if any of my passwords have been part of one of those major personal info thefts: Settings>Passwords.
I currently have 7 that have to be changed.
Who's completely certain that the Google and future Apple algorithms are trustworthy? If you answer yes, I hope you've got $125k for a cyber forensic investigation of your images. That's what some guy in Oregon is facing in addition to defense attorneys' fees after being accused of having "child porn" on his computer.
LiberalFighter
(50,906 posts)
OneCrazyDiamond
(2,031 posts)
I predict they will wait for the noise to die down before quietly implementing it.
Ron Obvious
(6,261 posts)I have this with a lot of modern technology: I want it to do what I want it to do, and nothing that I don't want it to do.
This should be self-evident and not raise flags.
Demsrule86
(68,556 posts)
Android.
cstanleytech
(26,286 posts)
also be my last, as this sounds too much like the first step down a very slippery slope, and it's one I would not be willing to follow them down.
LiberalFighter
(50,906 posts)
Private companies should only be doing this with their own property, not their customers'.
Law enforcement has the tools to identify these perverts on its own. When it has probable cause, it can bring in outside resources and get a warrant. In all likelihood, this iPhone feature will make mistakes that harm innocent people.
dalton99a
(81,466 posts)
Why put in a backdoor on a user's phone when they have been scanning everything on their iCloud since 2019?