this post was submitted on 18 Jun 2024
94 points (100.0% liked)
TechTakes
1427 readers
236 users here now
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
founded 1 year ago
Statements from LLMs should be treated as hallucinations unless verified through conventional research.
We don't need a fancy word that makes it sound like AI is actually intelligent when we're talking about how frequently AI is wrong and unreliable. AI being wrong is like someone misunderstanding something, or taking a joke literally and repeating it as fact.
When people are wrong, we don't call it hallucinating unless their senses are altered. AI doesn't have senses.
It's not a "fancy word" here, it's a technical term. An AI making things up is actually called hallucination in the literature.
Lmao