this post was submitted on 21 Nov 2024
154 points (97.5% liked)

Today, the prominent child safety organization Thorn, in partnership with the cloud-based AI solutions provider Hive, announced the release of an AI model designed to flag unknown CSAM at upload. It is the first AI technology aiming to expose unreported CSAM at scale.

[–] Railcar8095@lemm.ee 1 points 1 day ago (1 children)

Applying a GAN won't work. If used for filtering, it would skew the output toward a younger appearance, but it won't show the body of a 9-year-old unless the model could already do that from the beginning.
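
Roughly what that "filter" approach looks like, as a minimal toy sketch (the generator and classifier below are made-up stand-ins, not Thorn's or Hive's actual models): freeze both networks and search the generator's latent space for whatever the classifier scores highest. The search never leaves the generator's learned distribution.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy stand-ins. In the scenario being discussed these would be a
# pretrained image generator and the frozen upload-scanning classifier.
class ToyGenerator(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        self.latent_dim = latent_dim
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, 3 * 32 * 32), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(-1, 3, 32, 32)

class ToyClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 1))

    def forward(self, x):
        return self.net(x)

generator, classifier = ToyGenerator(), ToyClassifier()
for p in [*generator.parameters(), *classifier.parameters()]:
    p.requires_grad_(False)  # both networks stay frozen

# Only the latent code is optimized, so the search is confined to the
# distribution the generator already learned. Attributes it models can
# be nudged; content it never learned to produce is simply unreachable.
z = torch.randn(1, generator.latent_dim, requires_grad=True)
opt = torch.optim.Adam([z], lr=0.05)
for _ in range(200):
    score = classifier(generator(z))
    loss = F.binary_cross_entropy_with_logits(score, torch.ones_like(score))
    opt.zero_grad()
    loss.backward()
    opt.step()
```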

If used to "tune" the original model, it will produce massive hallucinations and aberrations that can lead to false positives.
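
And the "tuning" failure mode, reusing the toy generator and classifier from the sketch above (again, pure illustration, nobody's actual pipeline): train the generator's weights GAN-style against the frozen classifier, with no real data and no discriminator that learns.

```python
# Reusing generator, classifier, torch, and F from the sketch above.
for p in generator.parameters():
    p.requires_grad_(True)  # now the generator's weights are trained

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-4)
for _ in range(200):
    z = torch.randn(8, generator.latent_dim)
    score = classifier(generator(z))
    loss = F.binary_cross_entropy_with_logits(score, torch.ones_like(score))
    g_opt.zero_grad()
    loss.backward()
    g_opt.step()

# The only training signal is a static classifier, so the generator
# drifts toward whatever adversarial patterns inflate the score:
# degenerate, hallucinated outputs rather than coherent images, i.e.
# exactly the kind of thing that trips false positives.
```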

In both cases, decent results would be rare and time-consuming to get. Anybody with the dedication to attempt this already has pictures and can build their own model.

Source: I'm a data scientist

At least it's not "Source: I am a pedophile" lol