Singularity
This is an automated archive.

The original was posted on /r/singularity by /u/ResponsiveSignature on 2024-01-16 22:55:15+00:00.


The risk of something going wrong with AI will be 1000x greater if a model is released to the world than if it is kept private. As a consequence, the first company to achieve AGI will keep it behind closed doors and try to establish a new world order, offering, at best, the fruits of the AGI but never direct access to it.

As a result, the likely AGI future will leave humans placated but disempowered. The chaos of humans all battling one another with god-like powers could never stabilize without massive destruction.
