this post was submitted on 20 Sep 2023
549 points (95.5% liked)

[–] HipPriest@kbin.social 50 points 1 year ago (1 children)

I mean, this isn't miles away from what the writers' strike is about. Certainly I think the technology is great, but after the last round of tech companies turning out to be fuckwits (Facebook, Google, etc.) it's only natural that people are going to want to make sure this stuff is properly regulated and run fairly (not at the expense of human creatives).

[–] archomrade@midwest.social 2 points 1 year ago

As it stands now, I actually think it is miles away.

Studios were raking in huge profits from digital residuals that weren't being passed to creatives, but AI models aren't currently paying copyright holders anything. If they suddenly did start paying publishers for use, those payments would almost certainly exclude the actual creatives.

I'd also point out that LLMs aren't like digital distribution models, because LLMs aren't distributing copyrighted works. At best you can say they're distributing a very lossy (to use someone else's term) compressed alternative that would have to be pieced back together manually if you really wanted to extract it.

No argument that AI should be properly regulated, but I don't think copyright is the right framework for doing it.