loose
Irony?
A "Showerthought" is a term for the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. A showerthought should offer a unique perspective on an ordinary part of life.
must of made a mistake their
your so dumb lmao
thank you kind stranger
Should of proof red it
I need to of a word with you
Knead*
Now when you submit text to ChatGPT, it responds with “this.”
Unironically this
Criminaly underated post
As a language model, I laughed at this way harder than I should have
NTA, that was funny.
I'm waiting for it to start using units of banana for all quantities of things
ChatGPT trained on Reddit posts -> ChatGPT goes temporarily “insane”
Coincidence? I don't think so.
This is exactly what I was thinking.
And maybe some more people did what I did: not deleting my accounts, but replacing all my posts with content created by a bullshit generator. The texts looked normal, but everything was completely senseless.
They have always trained on Reddit data; GPT-2 was, at least. I'm unsure about GPT-1.
ChatGPT also chooses that guy's dead wife
The Narwhal Bacons at Midnight.
On the contrary, it'll become excessively perfectionist about it. Can't even say "could have" without someone coming in and saying "THANK YOU FOR NOT SAYING OF"
It already was, the only difference is that now reddit is getting paid for it.
It's going to be a poop-knife-wielding guy with 2 broken arms out to get those jackdaws.
From now on, when you say something like "I think I can give my hoodie to my girlfriend", it will answer "and my axe".
GROND
It was already trained on Reddit posts. It's just now they're paying for it.
ChatGPT was already trained on Reddit data. Check this video to see how one reddit username caused bugs on it: https://youtu.be/WO2X3oZEJOA?si=maWhUpJRf0ZSF_1T
And between were, we’re and where.
Insure and ensure.
"Can't even breath"
Your right.
"What is a giraffe?"
ChatGPT: "geraffes are so dumb."
“I have not been trained to answer questions about stupid long horses.”
And then and than.
And when it learns something new, the response will be "Holy Hell".
TIL
Sure it might have some effect, but a big part of ChatGPT besides "raw" training data is RLHF, reinforcement learning from human feedback. Realistically, the bigger problem is training on AI-generated content that might have correct spelling, but hardly makes sense.
Then I did the right thing by replacing my texts with correctly spelled nonsense.
Is it a showerthought if it's actually just incorrect