this post was submitted on 20 Jul 2023
1831 points (98.3% liked)

[–] catastrophicblues@lemmy.ca 1 points 1 year ago (1 children)

Yup. They have had issues (think CSAM scandal), but they’re slowly earning back my trust. I’m still a bit wary, but for big tech they have a pretty good track record.

[–] Whirlybird@aussie.zone 15 points 1 year ago (1 children)

> They have had issues (think CSAM scandal)

People like you who think that was a "scandal" are half the problem, though.

What they were doing with on-device CSAM scanning - which applied only to photos being uploaded to iCloud - was actually good for your privacy. It let them comply with any current and future CSAM laws while protecting your privacy, because the scanning happened on your device. It also meant they could add E2E encryption to iCloud (and then iMessage as well) while still complying with those laws. The alternative - what everyone else does, including Google, Microsoft, Imgur, Dropbox, etc. - is to do the CSAM scanning in the cloud after you've uploaded your photos, which requires the data to be stored unencrypted and visible to those companies (and the government).
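
To make the contrast concrete, here's a minimal, purely illustrative Python sketch. It is not Apple's actual design (which used a perceptual hash called NeuralHash together with cryptographic safety vouchers); the hash set, the toy encrypt() helper, and the use of plain SHA-256 are all stand-ins invented for this example:

```python
# Illustrative sketch only -- NOT Apple's real implementation.
# Assumptions (all hypothetical): KNOWN_HASHES holds hashes of already-identified
# CSAM supplied by child-safety organizations, encrypt() is a toy stand-in for real
# end-to-end encryption, and SHA-256 stands in for a perceptual hash.
import hashlib

KNOWN_HASHES: set[str] = set()

def encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption' purely for illustration -- never use this for real data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def on_device_upload(photo: bytes, user_key: bytes) -> dict:
    """Simplified version of the proposed approach: match locally, then upload
    only ciphertext plus a match flag (the 'safety voucher')."""
    matched = hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    return {"ciphertext": encrypt(photo, user_key), "voucher": matched}

def server_side_upload(photo: bytes) -> dict:
    """Simplified version of the common alternative: the provider receives the
    photo in readable form and does the matching on its own servers."""
    matched = hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    return {"plaintext": photo, "matched": matched}
```

In the first flow the server only ever sees ciphertext plus a match flag; in the second it has to hold a readable copy of the photo just to be able to scan it.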

Doing it on device should have been applauded, but it was attacked by people who didn't understand how it was actually better for them. There was so much misinformation thrown around: that it would scan all of your photos and files the moment they were created and instantly report you to the police if you took a photo of your infant in the bath, for example, or that governments would use it to identify people who had saved memes they didn't like. That's absurd, because it's not how the CSAM databases work - matching is done against hashes of specific, already-identified images, not against what a new photo happens to show.
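
A short continuation of the same hypothetical sketch makes that concrete - a match needs the photo's hash to already be in the curated database, so a brand-new photo or a random meme has nothing to match against:

```python
# Continues the hypothetical example above (same hashlib import and KNOWN_HASHES set).
new_bath_photo = b"a photo that has never existed anywhere before"
random_meme = b"some meme image a government might dislike"

for label, data in [("new bath photo", new_bath_photo), ("meme", random_meme)]:
    flagged = hashlib.sha256(data).hexdigest() in KNOWN_HASHES
    print(label, "flagged:", flagged)  # False for both - neither is in the known-image database
```

In the real proposal, perceptual hashes can occasionally collide, which is why the design as publicly described also required a threshold of multiple matches plus human review before anything was reported.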

Apple's proposed CSAM scanning was the most privacy-preserving approach in the entire industry, and people created such an outrage over it that Apple basically went "oh well, we'll just do what everyone else is doing" - which is far less secure and worse for privacy - and everyone congratulated themselves, lol.

[–] catastrophicblues@lemmy.ca 2 points 1 year ago

You make a good point. I guess the outrage was more about scanning at all, though I suppose that’s not on Apple.