FunkyCasual

joined 1 year ago
FunkyCasual@lemmygrad.ml 2 points 6 months ago

I accidentally used this loophole with Starfield.

I wrote a whole thing pleading my case, thinking there might be a slim chance of it getting approved since I was considerably over the two-hour limit. But it ended up just getting auto-approved within an hour or two.

I was initially shocked, but quickly realized what happened.

FunkyCasual@lemmygrad.ml 6 points 1 year ago

The people generating it are rarely the ones who are training the models. They take pretrained models and prompt them for what they want.

Even if they were training a model for a specific subject, they could train it with any pictures of the subject and combine it with another model that can generate the kind of image they want.

There is absolutely no reason they would need abuse images for training. There are far better general NSFW models available right now than they could ever train themselves.

FunkyCasual@lemmygrad.ml 6 points 1 year ago

No, I'm saying the models aren't being trained with actual CSAM. The comment I replied to was about training, not generation.

All I was saying is that you don't need to train a model on child abuse images to get it to output child abuse images.

FunkyCasual@lemmygrad.ml 2 points 1 year ago

That's because it isn't happening

There's just no reason to do so

FunkyCasual@lemmygrad.ml 6 points 1 year ago

Let's not forget the real pro gamer move

GBA -> DS