this post was submitted on 08 Dec 2024
11 points (92.3% liked)


Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).

1 comment
remotelove@lemmy.ca 3 points 2 weeks ago

Yes, CSAM is bad.

However, false positives from scans can also destroy lives. While I wouldn't cry about Apple losing millions in false-positive-related lawsuits, scanning simply isn't a good thing in this case.