this post was submitted on 17 Jul 2023
356 points (96.8% liked)
Asklemmy
It’s frustrating to see people’s imaginations run wild with AI. They’re not building “sentient” machines. There will never be machines that are sentient in anything other than appearance, and we’re notoriously easy to fool in that way.
My favorite way to describe AI that I’ve heard is “applied statistics.” It’s basically just processing huge amounts of data, very fast and in parallel, and then presenting the statistically most likely conclusions.
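A toy sketch of what that "applied statistics" framing means (this is an illustrative bigram model, not how any real LLM is implemented): count which words follow which in a corpus, then "generate" by always picking the statistically most frequent follower.

```python
from collections import Counter, defaultdict

# Tiny corpus; real models train on vastly more data with far richer statistics.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (bigram frequencies).
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most statistically likely word to follow `word`."""
    followers = bigrams[word]
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("the"))  # "cat" follows "the" more often than "mat" or "fish"
```

The point of the sketch: there's no understanding anywhere, just frequency counts and a "most likely" lookup. Scale that idea up enormously and you get something that can look uncannily fluent.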
Yes, it will be used to make weapons that are horrifically efficient, but likewise it will be used to make defenses that are equally efficient.
I think the good will ultimately outweigh the bad. Hopefully by a long shot.
LLMs appear to be spontaneously developing theory of mind, and nobody knows why or how. That means ChatGPT and the like can now model what the user is thinking, which opens avenues for actual manipulation. GPT-4 can solve 95% of the theory-of-mind tasks that a 7-year-old could.
Source: https://arxiv.org/abs/2302.02083