[–] generalpotato@lemmy.world 1 points 1 year ago (7 children)

Didn’t Apple try to introduce this and get a ton of flak from all sorts of privacy “experts”? They then scrapped their plans, didn’t they? How is this any better or different? Any sort of “backdoor” into encryption means the encryption is compromised. The US already went through this fight in 2014. It feels like déjà vu all over again.

[–] AlexKingstonsGigolo@kbin.social 3 points 1 year ago (4 children)

@generalpotato Ish. I read the technical write-up, and they actually came up with a very clever, privacy-focused way of scanning for child porn.

First, only photos were scanned and only if they were stored in iCloud.

Then, only perceptual hashes of the photos (Apple's NeuralHash) were collected, never the photos themselves.

Those hashes were matched against the hashes of known child-porn images, and a known image only counted if it appeared in the databases of multiple independent non-governmental organizations. So if an image was only in the database of, say, the National Center for Missing and Exploited Children, or only in the database of China's equivalent, its hash couldn't be used. That requirement makes it substantially harder for a dictator to slip in a hash to hunt for dissidents, because the target image would first have to land in enough independent databases.
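Roughly what that intersection requirement boils down to, as a toy Python sketch. The database names and hash values here are made up for illustration, and the real system used blinded hash databases and private set intersection rather than plain sets:

```python
# Sketch: only hashes present in EVERY source database are eligible targets.
# Database names and hash values are hypothetical.

ncmec_hashes = {"a1b2", "c3d4", "e5f6"}      # hypothetical NCMEC entries
other_ngo_hashes = {"c3d4", "e5f6", "9f9f"}  # hypothetical second NGO's entries

# A hash submitted by only one organization (e.g. "a1b2" or "9f9f")
# never makes it into the deployed match list.
eligible_targets = ncmec_hashes & other_ngo_hashes
print(eligible_targets)  # {'c3d4', 'e5f6'}
```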

Even then, an Apple employee would have to verify actual child porn was being stored in iCloud, and only after roughly 30 separate images were flagged (the threshold in Apple's published threat model). Apple estimated the odds of an innocent account ever reaching that stage at about one in a trillion per year, because of all of the stacked safeguards.

Only after an Apple employee confirmed the existence of child porn would the iCloud account be frozen and the relevant non-governmental organizations alerted.
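To make the flow concrete, here's a minimal sketch of that threshold-then-review pipeline as I understand it. The class and function names are mine, not Apple's, and the real system used threshold secret sharing so Apple couldn't even decrypt the match vouchers before the threshold was crossed:

```python
from dataclasses import dataclass, field

THRESHOLD = 30  # Apple's published initial match threshold

@dataclass
class Account:
    flagged: list = field(default_factory=list)
    frozen: bool = False

def handle_upload(account, photo_hash, target_hashes):
    """Record a match; escalate only once the threshold is crossed."""
    if photo_hash in target_hashes:
        account.flagged.append(photo_hash)
    if len(account.flagged) >= THRESHOLD and not account.frozen:
        escalate_to_human_review(account)

def escalate_to_human_review(account):
    # Stand-in for the manual step: a human confirms the material
    # before the account is frozen and the NGOs are alerted.
    account.frozen = True

# Usage: an account only trips human review after 30 distinct matches.
acct = Account()
targets = {f"hash{i}" for i in range(100)}
for i in range(30):
    handle_upload(acct, f"hash{i}", targets)
print(acct.frozen)  # True only once the 30th match lands
```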

Honestly, I have a better chance of getting a handjob from Natalie Portman in the next 24 hours than an innocent person has of being incorrectly reported to any government authority.

[–] MisuseCase@infosec.pub 1 points 1 year ago

It would have worked, and it would have protected privacy, but most people don't understand the difference between having a hash of known CSAM on your phone for comparison purposes and having actual CSAM on your phone, so it freaked people out.
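To illustrate the distinction: a hash is just a short fingerprint derived from the image bytes, and you can't reconstruct the picture from it. A quick sketch using an ordinary cryptographic hash (simpler than the perceptual NeuralHash Apple actually used, but the point is the same):

```python
import hashlib

image_bytes = b"...pretend these are the raw bytes of a photo..."
digest = hashlib.sha256(image_bytes).hexdigest()

# The phone would carry digests like this, never the images themselves.
print(digest)       # 64 hex characters, regardless of the image's size
print(len(digest))  # 64
```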

I understand the difference and I'm still uncomfortable with it, not because of the proximity to CSAM but because I don't like the precedent of anyone scanning my encrypted messages. Give them an inch, etc.
