This post was submitted on 19 Jan 2024

Singularity


Everything pertaining to the technological singularity and related topics, e.g. AI, human enhancement, etc.

This is an automated archive.

The original was posted on /r/singularity by /u/mvnnyvevwofrb on 2024-01-19 04:00:15+00:00.


Assume that it's impossible for AI to ever become sentient: it can't think, and it has no feelings or consciousness. What would the limits of AI be in that case? Would it be able to reason like a human being? Or would it always have issues, like hallucinations or other kinds of errors, or lack the insight of a human being? Or would it even matter?

no comments (yet)