This post was submitted on 27 Jun 2023
369 points (99.5% liked)
Technology
you are viewing a single comment's thread
view the rest of the comments
But if we know that it makes things up and gets things wrong, how can we trust any information it gives us? Fact-checking is one thing, but at that point, you might as well skip the LLM and just look the information up yourself.
At the end of the day, you can't 100% trust anything you see on the internet. You have to think critically about the answers it gives you and cross-reference them against other sources. That's no different from evaluating search results, which can also be wrong. But it's a great starting point.
It's a lot easier to get a thorough, concise answer from ChatGPT and double-check it than it is to wade through search engine results.