this post was submitted on 08 Dec 2023
395 points (93.2% liked)

Technology


‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

[–] Daxtron2@startrek.website 6 points 1 year ago (1 children)

The same way that photoshopping someone's face onto a pornstar's body is.

[–] cosmicrookie@lemmy.world 3 points 1 year ago (2 children)

But it's not. That is not legal.

I don't know if it is where you live, but here (a Scandinavian country) and in many other places around the world, it is illegal to create fake nudes of people without their permission.

[–] Daxtron2@startrek.website 5 points 1 year ago (1 children)

Ah, I didn't know that. AFAIK it's protected artistic speech in the US. Not to say that it's right, but that's probably why it's still a thing.

[–] barsoap@lemm.ee 2 points 1 year ago

In principle that's the case in Germany, too, but only if the person is of public interest (otherwise you're not supposed to publish any pictures of them where they are the focus of the image), and, secondly, it has to serve actually discernible satire, commentary, etc. Merely saying "I'm an artist and that's art" doesn't fly; hire a model. Similar to how you can dish out a hell of a lot of insults when you're doing a pointed critique, but if the critique is missing and it's only abuse, that doesn't fly.

Ha. Idea: An AfD politician as a garden gnome peeing into the Bundestag.

[–] TotallynotJessica@lemmy.world 2 points 1 year ago (2 children)

Appreciate how good you have it. In America, child sex abuse material is only illegal when children were abused in making it, or if it's considered obscene by a community. If someone edits adult actors to look like children as they perform sex acts, it's not illegal under federal law. If someone generates child nudity using AI models trained on nude adults and only clothed kids, it's not illegal at the national level.

Fake porn of real people could be banned for being obscene, usually at a local level, but almost any porn could be banned by lawmakers this way. Harmless stuff like gay or trans porn could be banned by bigoted lawmakers, because obscenity is a fairly subjective mechanism. However, because of our near absolute freedom of speech, obscenity is basically all we have to regulate malicious porn.

[–] cosmicrookie@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

The way I believe it is here is that it is illegal to distribute porn or nudes without consent, be it real or fake. I don't know how it is with AI-generated material of purely imaginary people; I don't think that's illegal. But if it is made to look like someone in particular, then you can get sued.

[–] CaptainEffort@sh.itjust.works 1 points 1 year ago

child sex abuse material is only illegal when children were abused in making it

This is literally why it’s illegal though. Because children are abused, permanently traumatized, or even killed in its making. Not because it disgusts us.

There are loads of things that make me want to be sick, but unless they actively hurt someone they shouldn’t be illegal.