this post was submitted on 30 Jul 2023
219 points (99.5% liked)


Greg Rutkowski, a digital artist known for his surreal style, opposes AI art but his name and style have been frequently used by AI art generators without his consent. In response, Stable Diffusion removed his work from their dataset in version 2.0. However, the community has now created a tool to emulate Rutkowski's style against his wishes using a LoRA model. While some argue this is unethical, others justify it since Rutkowski's art has already been widely used in Stable Diffusion 1.5. The debate highlights the blurry line between innovation and infringement in the emerging field of AI art.
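
For context on the mechanism being debated: a LoRA is a small adapter trained on a limited set of images and applied on top of a frozen base model, rather than a retrained copy of Stable Diffusion itself. The sketch below is only illustrative, using the Hugging Face diffusers library; the model repo and file names are placeholders, not the actual community-made tool discussed here.

```python
# Illustrative sketch only: how a community-made style LoRA is typically
# applied to a Stable Diffusion checkpoint with Hugging Face's diffusers.
# The LoRA repo and file names below are placeholders, not the real tool.
import torch
from diffusers import StableDiffusionPipeline

# SD 1.5 base model, which already included the artist's work in its training data
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe.to("cuda")

# Load the LoRA weights (a small style adapter) on top of the frozen base model
pipe.load_lora_weights("some-user/style-lora", weight_name="style.safetensors")

image = pipe(
    "a castle on a cliff at sunset, fantasy illustration",
    num_inference_steps=30,
).images[0]
image.save("output.png")
```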

[–] FaceDeer@kbin.social 5 points 1 year ago (1 children)

That's not the aspect you were arguing about in the comment I'm responding to. You said:

You keep comparing what one person, given MONTHS or YEARS of their life, could do with one artist's work to what a machine doing NOT THE SAME THING can do with thousands of artists' work.

And that's what I'm talking about here. The speed with which the machine does its work is immaterial.

Though frankly, if the machine stamping out parts had somehow "learned" how to do it by looking at thousands of existing parts, that would be fine too. So I don't see any problem here.

[–] Pulse@dormi.zone 2 points 1 year ago* (last edited 1 year ago) (1 children)

And that's where we have a fundamental difference of opinion.

A company hiring an engineer to design a machine that makes hammers, then hiring one or more people to build that machine so it can make hammers, is benefiting from the work product of people it hired. That may hurt the blacksmith, but nothing was stolen from the blacksmith.

A company taking someone else's work product to build its own product, without compensation or consent, is committing theft of labor.

I don't see those as equivalent situations.

[–] FaceDeer@kbin.social 6 points 1 year ago (1 children)

At least now you're admitting that it's a difference of opinion; that's progress.

You think it should be illegal to do this stuff. Fine. I think copyright terms have been extended to ridiculous lengths and should be a flat 30 years at most. But in both cases our opinions differ from what the law actually says. Right now there's nothing illegal about training an AI on someone's lawfully obtained published work, which is what was done here.

[–] Pulse@dormi.zone 2 points 1 year ago

I'm not a fan of our copyright system. IMO, copyright terms are far too long, and the system should also include clauses that place anything not available for (easy) access into the public domain.

Also, I'm not talking about what laws say, should say or anything like that.

I've just been sharing my opinion that it's unethical, and I've not seen any good explanation for how stealing someone else's labor is "good."