this post was submitted on 24 Aug 2023
If it’s only as good as the data it’s trained on, garbage in / garbage out, then in my opinion it’s “machine learning,” not “artificial intelligence.”
Intelligence has to include some critical, discriminating faculty, not just pattern-matching vomit.
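The "garbage in / garbage out" point can be made concrete with a toy sketch (purely illustrative, nothing to do with Google's actual system): a bigram model trained on a tiny corpus can only recombine word sequences it has already seen. It has no faculty for judging whether its output is true or good — it is pattern matching all the way down.

```python
import random
from collections import defaultdict

# Hypothetical toy corpus; a real LLM trains on billions of words,
# but the principle is the same: the model can only echo its data.
corpus = "garbage in garbage out . the model repeats the data . the data is all it knows ."

def train(text):
    """Count, for each word, the words observed to follow it."""
    model = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model, start, n=8, seed=0):
    """Emit up to n words by sampling only from observed continuations."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        followers = model.get(out[-1])
        if not followers:
            break  # no observed continuation: the model is stuck
        out.append(rng.choice(followers))
    return " ".join(out)

model = train(corpus)
print(generate(model, "garbage"))
```

Every pair of adjacent words the generator emits already occurs somewhere in the training text — if the corpus endorses slavery, so will the output, with no "critical faculty" anywhere in the loop.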
We don't yet have the technology to create actual artificial intelligence. It's an annoyingly pervasive misnomer.
And the media isn't helping. The title of the article is "Google’s Search AI Says Slavery Was Good, Actually." It should be "Google’s Search LLM Says Slavery Was Good, Actually."
Yup, "AI" is the current buzzword.
Hey, just like blockchain tech!
Unfortunately, people who grow up in racist groups also tend to be racist. Slavery used to be considered normal and justified for various reasons. For many, killing someone with a religion or belief different from their own is OK. I am not advocating for moral relativism, just pointing out that a computer learns what is or is not moral the same way humans do: from other humans.
You make a good point. Though humans at least sometimes apply some critical thinking between absorbing an idea and acting on it.
Not enough. Not enough.
Scathing and accurate when your point is made about people too.