this post was submitted on 20 Apr 2024
485 points (98.2% liked)

Gaming

20006 readers
24 users here now

Sub for any gaming related content!

[–] makingStuffForFun@lemmy.ml 170 points 7 months ago (2 children)

Imagine reading that headline 20 years ago.

[–] Demdaru@lemmy.world 53 points 7 months ago (2 children)

God, that would sound so dystopian and futuristic... but to be honest, most articles about AI today would sound like that back then. Damn, people would freak out about privacy.

[–] melpomenesclevage@lemm.ee 8 points 6 months ago

pretty sure they didn't.

[–] catloaf@lemm.ee 12 points 6 months ago

BOINC came out 21 years ago, so it wouldn't be that unreasonable.

[–] redcalcium@lemmy.institute 137 points 7 months ago (4 children)

So, it's like folding@home, but instead of donating your spare compute to science, you sell it to generate porn?

[–] Someonelol@lemmy.dbzer0.com 68 points 6 months ago

Porning@home

[–] petersr@lemmy.world 20 points 6 months ago

Can we at least see it?

[–] nucleative@lemmy.world 8 points 6 months ago

This... This was inevitable.

[–] cygon@lemmy.world 93 points 6 months ago (1 children)

So... this AI company gets gaming teens to "donate" their computing power, rather than pay for render farms / GPU clouds?

And then oblivious parents pay the power bills, effectively covering the computing costs of the AI porn company?

Sounds completely ethical to me /s.

[–] fidodo@lemmy.world 9 points 6 months ago

No no, they're getting copies of digital images out of it. It's a totally fair trade!

[–] PaupersSerenade@sh.itjust.works 84 points 7 months ago (21 children)

I’ll be a minority voice considering the other comments. But maybe just pay for onlyfans or whatever you guys use. I’m a generally attractive woman (I can surmise from interactions while trying to date) and I really don’t like the idea that my likeness would be used for something like this. Get your jollies off, but try and be a bit consensual about it. Is that so much to ask?

[–] GrymEdm@lemmy.world 67 points 7 months ago* (last edited 7 months ago) (3 children)

It isn't too much to ask. According to Dr. K of HealthyGamerGG (Harvard Psychiatrist/Instructor), research shows that the release of non-consensual porn makes the unwilling subjects suicidal over half the time. Non-consensual porn = deepfakes, revenge porn, etc. It's seriously harmful, and there are other effects like depression, shame, PTSD, anxiety, and so on. There is functionally unlimited porn out there that is made with consent, and if someone doesn't want to be publicly sexually explicit then that's their choice.

I'm not against AI porn in general (I consider it the modern version of dirty drawings/cartoons), but when it comes to specific likenesses as with deepfakes then there's clear proof of harm and that's enough for me to oppose it. I don't believe there's some inherent right to see specific people naked against their will.

[–] fidodo@lemmy.world 10 points 6 months ago

I think it would be too big of a privacy overreach to try to ban it outright, since what people do on their own computers is their own business and there's no way to enforce a full ban without being incredibly intrusive. But as soon as it gets distributed in any way, I think it should be prosecuted as heavily as real non-consensual porn that was taken against someone's will.


I think the key is a lot of people don't want to pay for porn. And in the case of deep fakes, it's stuff they literally cannot pay money to get.

[–] venoft@lemmy.world 20 points 6 months ago (7 children)

AI porn isn't deepfake porn. The default is just a random AI-generated face and body. Unless you specifically set out to, it's difficult to deepfake someone.

[–] ArbiterXero@lemmy.world 12 points 7 months ago (1 children)

So I’m not disagreeing with you, but you’re assuming they’re making deepfake images, and the article doesn’t specify that. In fact I’d bet that it’s just AI generated “people” that don’t exist.

What about AI porn of a person that doesn’t exist?

[–] Arbiter@lemmy.world 25 points 7 months ago (1 children)

However, one of Salad's clients is CivitAi, a platform for sharing AI generated images which has previously been investigated by 404 media. It found that the service hosts image generating AI models of specific people, whose image can then be combined with pornographic AI models to generate non-consensual sexual images.

[–] ArbiterXero@lemmy.world 12 points 7 months ago (1 children)

Fair, somehow I missed that

[–] BudgetBandit@sh.itjust.works 7 points 6 months ago

I know someone who’s into really dark romance stuff, like really hardcore stuff, but she’d never do some of this due to safety reasons. I can totally see her generating scenes of herself in those situations.

[–] frightful_hobgoblin@lemmy.ml 54 points 7 months ago (1 children)

Capitalism breeds innovation

[–] Flyberius@hexbear.net 49 points 6 months ago (1 children)

I remember when GPUs were used to fold proteins...

[–] Snowyday@startrek.website 15 points 6 months ago (1 children)

I wore an onion on my belt

[–] SturgiesYrFase@lemmy.ml 8 points 6 months ago

As was the fashion at the time

[–] fckreddit@lemmy.ml 49 points 7 months ago

This feels exploitative AF on multiple levels.

[–] PDFuego@lemmy.world 41 points 6 months ago (2 children)

If I'm reading this right, it's a program that users sign up for to donate their processing power (and can opt in or out of adult content), which is then used by client companies to generate their own users' content? It even says that Salad can't view or moderate the images, so what exactly are they doing wrong besides providing service to potentially questionable companies? It makes as much sense as blaming Nvidia or Microsoft, am I missing something?

[–] Cethin@lemmy.zip 24 points 6 months ago (2 children)

Based on the rewards, I'm assuming it's being done by very young people. Presumably the value of rewards is really low, but these kids haven't done the cost-benefit analysis. If I had to guess, for the vast majority it costs more in electricity than they get back, but the parents don't know it's happening.

This could be totally wrong. I haven't looked into it. This is how most of these things work though. They prey on the youth and their desire for these products to take advantage of them.
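
A rough back-of-envelope sketch of the cost-benefit question raised in the comment above. Every figure here (GPU draw, hours online, electricity rate, payout) is an assumption for illustration, not a number from the article; the takeaway is only that the margin is thin and flips sign with the local power price and payout rate.

```python
# Back-of-envelope: does renting out a gaming GPU cover its own power bill?
# All figures are illustrative assumptions, not numbers from the article.
GPU_DRAW_WATTS = 250    # assumed draw of a mid-range gaming card under load
HOURS_PER_DAY = 8       # assumed hours the rig is rented out each day
PRICE_PER_KWH = 0.15    # assumed residential electricity price, $/kWh
PAYOUT_PER_HOUR = 0.05  # assumed payout for a card of this class, $/hr

daily_cost = (GPU_DRAW_WATTS / 1000) * HOURS_PER_DAY * PRICE_PER_KWH
daily_payout = PAYOUT_PER_HOUR * HOURS_PER_DAY

print(f"electricity: ${daily_cost:.2f}/day, "
      f"payout: ${daily_payout:.2f}/day, "
      f"net: ${daily_payout - daily_cost:+.2f}/day")
# With these made-up numbers the net is about +$0.10/day; a pricier power
# tariff or a lower payout pushes it negative, which is the commenter's point.
```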

[–] fidodo@lemmy.world 8 points 6 months ago (1 children)

"so what exactly are they doing wrong besides providing service to potentially questionable companies?"

Well I think that is the main point of what is wrong. I think the big question is whether the mature content toggle is on by default or not. The company says it's off, but some users said otherwise. Dunno why the author didn't install it and check.

[–] PDFuego@lemmy.world 7 points 6 months ago (2 children)

They said they did.

However, by default the software settings opt users into generating adult content. An option exists to "configure workload types manually" which enables users to uncheck the "Adult Content Workloads" option (via 404 media), however this is easily missed in the setup process, which I duly tested for myself to confirm.

Honestly, and I'm not saying I support what's being done here, but the way I see it, if you're tech-savvy enough to be interested in using a program like this, you should be looking through all of the options properly anyway. If users don't care what they're doing and are only interested in the rewards, that's kind of on them.

I just think the article is focused on the wrong company: Salad is selling a tool that is potentially being misused by users of their client's service. I can certainly see why that can be a problem, but based on the information given in the article, I don't think it's really their problem. If that's ALL Salad is used for, then that's a different story.

[–] DestroyerOfWorlds@sh.itjust.works 33 points 7 months ago (1 children)

explain this to a person in 1998

[–] pokexpert30@lemmy.pussthecat.org 27 points 6 months ago (2 children)

I kinda fail to see the problem. The GPU owner doesn't see what workload they are processing. The pr0n company is willing to pay for GPU power. The GPU owner wants to earn money with their hardware. There's demand, there's supply, nobody is getting hurt (AI pr0n is not illegal, at least for now), so let people do what they want to do.

[–] mavu@discuss.tchncs.de 14 points 6 months ago (1 children)

The problem is that they are clearly targeting minors who don't pay their own electricity bill, and don't even necessarily realize that they are paying for their Fortnite skins with their parents' money. Also: there is a good chance that the generated pictures are at some point present in the filesystem of the generating computer, and that alone is a giant can of worms that can even lead to legal trouble if the person lives in a country where some or all kinds of pornography are illegal.

This is a shitty grift, abusing people who don't understand the consequences of the software.

[–] SSJ2Marx@hexbear.net 20 points 7 months ago

Wow. Imagine burning out your expensive GPU for a Fortnite skin.

[–] Mango@lemmy.world 14 points 6 months ago (1 children)

Great. Now we're trading pre-made traditional artwork to kids in exchange for fresh robot porn!

[–] Raiderkev@lemmy.world 13 points 6 months ago (1 children)

You would think they would do this to mine Bitcoin too.

[–] Agora@discuss.tchncs.de 12 points 6 months ago

Boring Dystopia

[–] Gemini24601@lemmy.world 11 points 7 months ago (1 children)

What? Seems like porn generation is the new crypto mining.

[–] DudeDudenson@lemmings.world 16 points 7 months ago (1 children)

I'd rather have a wealth of new porn around than thousands of random blockchains going around.

At least the porn will probably be useful for someone long term haha
