this post was submitted on 09 Jan 2024
525 points (98.2% liked)

Technology


‘Impossible’ to create AI tools like ChatGPT without copyrighted material, OpenAI says

Pressure grows on artificial intelligence firms over the content used to train their products

top 50 comments
[–] hellothere@sh.itjust.works 172 points 10 months ago (15 children)

OK, so pay for it.

Pretty simple, really.

[–] bjoern_tantau@swg-empire.de 128 points 10 months ago (3 children)

Or let's use this opportunity to make copyright much less draconian.

[–] dhork@lemmy.world 85 points 10 months ago* (last edited 10 months ago) (40 children)

¿Por qué no los dos? (Why not both?)

I don't understand why people are defending AI companies sucking up all human knowledge by saying "well, yeah, copyrights are too long anyway".

Even if we went back to the pre-1976 term of 28 years, renewable once for a total of 56 years, there's still a ton of recent works that AI are using without any compensation to their creators.
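To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python (the publication years are made-up examples, purely illustrative):

    # Toy illustration: when would a work published in a given year enter the
    # public domain under the old US rule of 28 years, renewable once for 56?
    def old_rule_expiry(publication_year: int, renewed: bool = True) -> int:
        term = 56 if renewed else 28
        return publication_year + term

    for year in (1970, 1995, 2010):
        print(year, "->", old_rule_expiry(year))
    # 1970 -> 2026, 1995 -> 2051, 2010 -> 2066
    # Even under the shorter old term, most of what gets scraped for training
    # today would still be under copyright.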

I think it's because people are taking this "intelligence" metaphor a bit too far and assume that if we restrict how the AI uses copyrighted works, that would restrict how humans use them too. But AI isn't human; it's just a glorified search engine. And at least all a standard search engine does is return a link to the actual content. These AI models chew up the content and spit out something based on it. It simply makes sense that this new process should be licensed separately, and I don't care if it makes some AI companies go bankrupt. Maybe they can work adequate payment for content into their business model going forward.

[–] deweydecibel@lemmy.world 23 points 10 months ago* (last edited 10 months ago) (2 children)

It shouldn't be cheap to absorb and regurgitate the works of humans the world over in an effort to replace those humans and subsequently enrich a handful of Silicon Valley people.

Like, I don't care what you think about copyright law and how corporations abuse it; AI itself is corporate abuse.

And unlike copyright, which does serve its intended purpose of helping small-time creators as much as it helps Disney, the true benefits of AI are overwhelmingly for corporations and investors. If our draconian copyright system is the best tool we have to combat that, good. It's absolutely the lesser of the two evils.

[–] hellothere@sh.itjust.works 37 points 10 months ago* (last edited 10 months ago) (12 children)

I'm no fan of the current copyright law - the Statute of Anne was much better - but let's not kid ourselves that some of the richest companies in the world have any desire whatsoever to change it.

[–] Fisk400@feddit.nu 13 points 10 months ago

As long as capitalism exists in society, just being able to go "yoink" and take everyone's art will never be a practical rule set.

[–] flop_leash_973@lemmy.world 76 points 10 months ago* (last edited 10 months ago) (13 children)

If it ends up being OK for a company like OpenAI to commit copyright infringement to train their AI models, it should be OK for John/Jane Doe to pirate software for private use.

But that would never happen. Almost like the whole of copyright has been perverted into a scam.

[–] KingThrillgore@lemmy.ml 53 points 10 months ago (2 children)

It's almost like we used to have a place where copyrighted works eventually ended up (the public domain), but they extended the dates because money.

[–] Ultraviolet@lemmy.world 18 points 10 months ago

This is where they have the leverage to push for actual copyright reform, but they won't. Far more profitable to keep the system broken for everyone but have an exemption for AI megacorps.

[–] rivermonster@lemmy.world 17 points 10 months ago

I was literally about to come in here and say it would be an interesting tangential conversation to talk about how FUCKED copyright laws are, and how relevant to the discussion it would be.

More upvotes for you!

[–] 800XL@lemmy.world 50 points 10 months ago

I guess the lesson here is: pirate everything under the sun, and as long as you establish a company and train a bot, everything is A-OK. I wish we'd known this back when everyone was getting dinged for torrenting The Hurt Locker.

Remember when the RIAA got caught with pirated mp3s and nothing happened?

What a stupid timeline.

[–] Milk_Sheikh@lemm.ee 41 points 10 months ago (13 children)

Wow! You're telling me that onerous and crony copyright laws stifle innovation and creativity? Thanks for solving the mystery, guys; we never knew that!

[–] reverendsteveii@lemm.ee 39 points 10 months ago (3 children)

If it's impossible for you to have something without breaking the law, you have to do without it.

If it's impossible for the aristocrat class to have something without breaking the law, we change or ignore the law.

[–] dutchkimble@lemy.lol 35 points 10 months ago

Cool, don't do it then

[–] unreasonabro@lemmy.world 32 points 10 months ago (3 children)

Finally capitalism will notice how many times it has shot itself in the foot with its ridiculous, greedy infinite copyright scheme.

As a musician, the people not involved in making my music are the ones making all the money from it nowadays instead of me anyway. Burn it all down.

[–] kibiz0r@lemmy.world 28 points 10 months ago (4 children)

I'm dumbfounded that any Lemmy user supports OpenAI in this.

We're mostly refugees from Reddit, right?

Reddit invited us to make stuff and share it with our peers, and that was great. Some posts were just links to the content's real home: Youtube, a random Wordpress blog, a Github project, or whatever. The post text, the comments, and the replies only lived on Reddit. That wasn't a huge problem, because that's the part that was specific to Reddit. And besides, there were plenty of third-party apps to interact with those bits of content however you wanted to.

But as Reddit started to dominate Google search results, it displaced results that might have linked to the "real home" of that content. And Reddit realized a tremendous opportunity: They now had a chokehold on not just user comments and text posts, but anything that people dare to promote online.

At the same time, Reddit slowly moved from a place where something might get posted by the author of the original thing to a place where you'll only see the post if it came from a high-karma user or bot. Mutated or distorted copies of the original instance, reformatted to cut through the noise and gain the favor of the algorithm. Re-posts of re-posts, with no reference back to the original, divorced from whatever context or commentary the original creator may have provided. No way for the audience to respond to the author in any meaningful way and start a dialogue.

This is a miniature preview of the future brought to you by LLM vendors. A monetized portal to a dead internet. A one-way street. An incestuous ouroboros of re-posts of re-posts. Automated remixes of automated remixes.

--

There are genuine problems with copyright law. Don't get me wrong. Perhaps the most glaring problem is the fact that many prominent creators don't even own the copyright to the stuff they make. It was invented to protect creators, but in practice this "protection" gets assigned to a publisher immediately after the protected work comes into being.

And then that copyright -- the very same thing that was intended to protect creators -- is used as a weapon against the creator and against their audience. Publishers insert a copyright chokepoint in between the two, and they squeeze as hard as they desire, wringing out every drop of profit, keeping creators and audiences far away from each other. Creators can't speak out of turn. Fans can't remix their favorite content and share it back to the community.

This is a dysfunctional system. Audiences are denied the ability to access information or participate in culture if they can't pay for admission. Creators are underpaid, and their creative ambitions are redirected to what's popular. We end up with an auto-tuned culture -- insular, uncritical, and predictable. Creativity reduced to a product.

But.

If the problem is that copyright law has severed the connection between creator and audience in order to set up a toll booth along the way, then we won't solve it by giving OpenAI a free pass to do the exact same thing at massive scale.

[–] ook_the_librarian@lemmy.world 27 points 10 months ago (3 children)

It's not "impossible". It's expensive and will take years to produce material under an encompassing license in the quantity needed to make the model "large". Their argument is basically "but we can have it quickly if you allow legal shortcuts."

[–] Blackmist 26 points 10 months ago (8 children)

Maybe you shouldn't have done it then.

I can't make a Jellyfin server full of content without copyrighted material either, but the key difference here is I'm not then trying to sell that to investors.

[–] wosat@lemmy.world 22 points 10 months ago (1 children)

This situation seems analogous to when air travel started to take off (pun intended) and existing legal notions of property rights had to be adjusted. IIRC, a farmer sued an airline for trespassing because they were flying over his land. The court ruled against the farmer because to do otherwise would have killed the airline industry.

[–] whoisearth@lemmy.ca 22 points 10 months ago (6 children)

If OpenAI is right (I think they are), one of two things needs to happen.

  1. All AI should be open source and non-profit.
  2. Copyright law needs to be abolished.

For number 1: good luck, for all the reasons we all know. Capitalism must continue to operate.

For number 2: good luck, because those in power are mostly there off the backs of those who came before them (see Disney, Apple, Microsoft, etc.).

Anyways, fun to watch play out.

[–] SCB@lemmy.world 15 points 10 months ago* (last edited 10 months ago) (12 children)

There's a third solution you're overlooking.

3: OpenAI (or another AI company) wins a judgment that AI content is not inherently a violation of copyright, regardless of the materials it was trained on.

[–] McArthur@lemmy.world 20 points 10 months ago (5 children)

It feels to me like every other post on Lemmy is talking about how copyright is bad and should be changed, or how piracy is caused by fragmentation and difficulty accessing content (streaming sites). Then whenever this topic comes up, everyone completely flips. But in my mind all this would do is fragment the AI market much like streaming services did (suddenly you have 10 different models with different licenses), and make it harder for non-megacorps without infinite money to fund their own LLMs (of good quality).

Like seriously, can't we just stay consistent and keep saying copyright is bad even in this case? It's not really an AI problem that jobs are affected, just a capitalism problem. Throw in some good social safety nets and tax these big AI companies, and we wouldn't even have to worry about artists' well-being.

[–] HiddenLayer5@lemmy.ml 18 points 10 months ago* (last edited 10 months ago) (1 children)

I think looking at copyright in a vacuum is unhelpful because it's only one part of the problem. IMO, the reason people are okay with piracy of name-brand media but are not okay with OpenAI using human-created artwork comes from the same logic as not liking companies and capitalism in general. People don't like the fact that AI is extracting value from individual artists to make the rich even richer while giving nothing in return to those artists, in the same way we object to massive and extremely profitable media companies paying their artists peanuts. It's also extremely hypocritical that the government, and by extension "copyright", seems to care much more that OpenAI is using name-brand media than it cares about OpenAI scraping the internet for independent artists' work.

Something else to consider is that AI is also undermining copyleft licenses. We saw this with GitHub Copilot, a 100% proprietary product that was nonetheless trained on all of GitHub's user-generated code, including GPL and other copyleft-licensed code. The art equivalent would be CC-BY-SA licenses, where derivatives also have to stay under Creative Commons.
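For anyone less familiar with how "share-alike" terms propagate, here's a tiny hypothetical sketch of the rule being described (the license lists are simplified assumptions, not legal guidance):

    # Simplified illustration of copyleft / share-alike propagation:
    # a derivative of a share-alike work must itself stay under a
    # share-alike license, so it can't be relicensed as proprietary.
    SHARE_ALIKE = {"GPL-3.0", "CC-BY-SA-4.0"}

    def derivative_license_ok(source_license: str, proposed_license: str) -> bool:
        if source_license in SHARE_ALIKE:
            return proposed_license in SHARE_ALIKE
        return True  # permissive sources impose no such requirement

    print(derivative_license_ok("CC-BY-SA-4.0", "Proprietary"))   # False
    print(derivative_license_ok("CC-BY-SA-4.0", "CC-BY-SA-4.0"))  # True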

[–] dasgoat@lemmy.world 20 points 10 months ago (1 children)
[–] NeatNit@discuss.tchncs.de 26 points 10 months ago (7 children)

hijacking this comment

OpenAI was IMHO well within its rights to use copyrighted materials when it was just doing research. They were* doing research on how far large language models can be pushed and where the ceiling for that is. It's genuinely good research, and if copyrighted works are used just for research and what gets published is the findings of the experiments, that's perfectly okay in my book - and, I think, in the law as well. In this case, the LLM is an intermediate step, and the published research papers are the "product".

The unacceptable turning point is when they took all the intermediate results of that research and flipped them into a product. That's not the same, and most or all of us here can agree - this isn't okay, and it's probably illegal.

* disclaimer: I'm half-remembering things I've heard a long time ago, so even if I phrase things definitively I might be wrong

[–] S410@lemmy.ml 19 points 10 months ago (19 children)

They're not wrong, though?

Almost all information that currently exists has been created in the last century or so. Only a fraction of all that information is available to be legally acquired for use, and only a fraction of that already small fraction has been explicitly licensed under permissive licenses.

Things that we don't even think of as "protected works" are in fact just that. It doesn't matter what it is: napkin doodles, writings on bathroom stall walls, letters written to friends and family. All of those things are protected unless stated otherwise. And, I don't know about you, but I've never seen a license notice attached to a napkin doodle.

Now, imagine trying to raise a child while avoiding every piece of information like that: information that you aren't licensed to use. You wouldn't end up with a person well suited to exist in the world. They'd lack education in science and technology, they'd lack understanding of pop culture, they'd know no brand names, etc.

Machine learning models are similar. You can train them that way, sure, but they'd be basically useless for real-world applications.
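As a rough illustration of what "training them that way" would mean in practice, here's a minimal, hypothetical sketch that keeps only explicitly permissively licensed documents before they ever reach a training pipeline (the license names and data layout are made up for the example):

    # Hypothetical sketch: keep only documents whose metadata carries an
    # explicit permissive license before handing them to a training pipeline.
    PERMISSIVE = {"CC0-1.0", "CC-BY-4.0", "MIT", "Apache-2.0"}

    documents = [
        {"text": "napkin doodle description", "license": None},  # implicit copyright
        {"text": "public-domain dedication",  "license": "CC0-1.0"},
        {"text": "random blog post",          "license": "All rights reserved"},
    ]

    def licensed_subset(docs):
        # Drop anything without an explicit permissive license.
        return [d for d in docs if d["license"] in PERMISSIVE]

    corpus = licensed_subset(documents)
    print(f"{len(corpus)} of {len(documents)} documents usable")  # 1 of 3

Most material has no explicit license at all, so the usable corpus collapses, which is the point being made above.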

[–] AntY@lemmy.world 52 points 10 months ago (18 children)

The main difference between the two in your analogy, that has great bearing on this particular problem, is that the machine learning model is a product that is to be monetized.

[–] Chee_Koala@lemmy.world 19 points 10 months ago (2 children)

But our current copyright model is so robust and fair! They'll only have to wait 95 years after the author died, which is a completely normal period.

If you want to control your creations, you are completely free NOT to publish them. Nowhere is it stated that, to be valuable or beautiful, a work has to be shared on the world stage.

We could have a very restrictive copyright for works that were never globally transmitted/published, and another for works whose copyright owner DID choose to broadcast them globally. They'd have a couple of years to cash in, and then after, I dunno, 5 years, we could all use the work as we see fit. If you use mass media to broadcast creative works but then get mad when the public transforms or remixes your work, you are part of the problem.

Current copyright is just a tool for folks with power to hold on to that power. It's what a boomer would come up with while driving their tractor/SUV, chanting to themselves: I have earned this.

[–] afraid_of_zombies@lemmy.world 19 points 10 months ago

If the copyright people had their way, we wouldn't be able to write a single word without paying them. This whole thing is clearly a fucking money grab. It is not struggling artists being wiped out; it is big corporations suing a well-funded startup.

[–] CosmoNova@lemmy.world 18 points 10 months ago* (last edited 10 months ago)

Let's wait until everyone is laid off and it's 'impossible' to get by without mass looting then, shall we?

[–] Boiglenoight@lemmy.world 18 points 10 months ago* (last edited 10 months ago) (1 children)

Piracy by another name. Copyrighted materials are being used for profit by companies that have no intention of compensating the copyright holder.

[–] Treczoks@lemmy.world 16 points 10 months ago (1 children)

If a business relies on breaking the law as the foundation of its business model, it is not a business but an organized crime syndicate. A mafia.

[–] holycrap@lemm.ee 16 points 10 months ago

I have the perfect solution. Shorten the copyright duration.

[–] phillaholic@lemm.ee 15 points 10 months ago (11 children)

A ton of people need to read some basic background on how copyright, trademarks, and patents protect people. Having none of those things would be horrible for modern society: it would wipe out millions of jobs and medical advancements, and put control into the hands of the companies best able to steal and strongarm. If you want to live in a world run by Mafia-style big business, then sure.
