this post was submitted on 31 Dec 2023
464 points (99.2% liked)
Not The Onion
The problem is that breathless AI news stories have made people misunderstand LLMs. The capabilities get a lot of attention; the limitations get much less.
And one important limitation of LLMs: they're really bad at being exactly right, while being really good at looking right. So if you ask one to do an arithmetic problem you can't do in your head, it'll give you an answer that looks right. But if you check it with a calculator, you'll find the only thing right about the answer is how it sounds.
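To make that concrete, here's a minimal sketch (the "LLM answer" is invented for illustration, but it's the typical failure mode: right ballpark, right digit count, wrong number):

```python
# Hypothetical: an LLM asked "What is 8,734 x 9,127?" replies with a
# number that has the right magnitude and digit count, so it *looks* right.
llm_answer = 79_714_518        # invented, plausible-sounding output

# The calculator check is one line:
actual = 8_734 * 9_127         # 79,715,218
print(llm_answer == actual)    # False: it only sounded right
```

A human eyeballing those two numbers side by side could easily miss the difference, which is exactly why "looks right" is such a dangerous failure mode.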
So if you use it to find cases, it's gonna be really good at finding cases that look exactly like what you need. The only problem is, they're not exactly what you need, because they're not real cases.