this post was submitted on 17 Sep 2023
This is the beginning of the end, friend.
People who use AI will create a better, cheaper product, and at the end of the day its use as a new technology is justified. You'll be clinging to an ever-smaller raft and eventually have to abandon your ideals.
And at the end of the day, art is not stolen when used to train a machine. Copyright itself is an artificial legal construct, and it's the right to redistribute, not the right to learn from art. You can't invent rights out of thin air and expect recourse when they're broken.
I feel like this assumes there will still be human-produced art to train on to improve the genAI model, when there isn't any incentive for humans to spend so much time learning to make art if it can just be used for training and machines can churn out pieces at a faster, cheaper rate.
From section 2(c)(iii) of OpenAI's Terms of Use: somehow, while it's justifiable for corporations to use human-produced work to train a machine that competes with humans, using corporate machine-produced work to train a competing machine is not.
Understand that this is not an IP right that OpenAI is defining and promising enforcement of, but simply a contractual obligation. As it currently stands in the US, there is no property right in the outputs of a generative model (like a GPT or Stable Diffusion model).
Yes, but it comes off as really hypocritical for companies to put that in their Terms, because they know rival genAI models could train on their output data to undercut them the same way they trained freely off of humans' data to undercut humans. And somehow it's only okay if they're the ones benefiting from it, because they have a bigger team of lawyers.