this post was submitted on 14 Apr 2024
273 points (91.7% liked)
Futurology
you are viewing a single comment's thread
Maybe it's this arbitrary word, hallucination? It was recently borrowed from the human experience to explain why something that is normally factual, like a computer, is not computing facts. But if one were to think about it, what is the difference between a series of non-factual hallucinations in a model and a person's individual experience of the world?
Before, we called these bugs, or even issues. But now that they live inside a black box of sorts, whose decision-making process we can't alter as directly as before, there is suddenly this more human-sounding name.
To clarify: when an LLM gets a fact wrong because it has limited context or because its foundational model is flawed, is that the same as the experience someone has after consuming psychedelic mushrooms? No, I wouldn't say so. Nor is it the same as when a team of scientists tries to make a model actively hallucinate so they can find new chemical compounds.
Defining words can be very tricky, especially when they apply to multiple areas of study. The more you drill into a definition, the more it becomes a metaphysical debate. But it is important to have these discussions, because even the definition of something like AGI keeps changing, and in fact it only exists because the goalposts for AI have moved so much. What will stop a company trying to attract investors from just slapping an AGI label on its next release? And how will we differentiate what the spirit of the word is trying to convey from the sales pitch?

Hallucinations are not qualia.

Please go talk to an LLM about hallucinations (you can use DuckDuckGo's implementation of ChatGPT) and see why the word is being used to mean a fairly different thing from human hallucinations.