this post was submitted on 23 Sep 2024
1 points (100.0% liked)

Singularity

131 readers

Everything pertaining to the technological singularity and related topics, e.g. AI, human enhancement, etc.

founded 1 year ago
This is an automated archive made by the Lemmit Bot.

The original was posted on /r/singularity by /u/Chmuurkaa_ on 2024-09-23 07:52:13+00:00.


If AI becomes superintelligent, how will we know that it is superintelligent? The whole point is that it would be so smart that we would not even be able to comprehend it, so how would we even measure it? We are superintelligent compared to dogs. Do dogs see us as superintelligent beings? They probably just see us as somewhat smarter, and that's where their scope of perception ends. What if we create superintelligence and say, "Yeah, this thing is really smart. It outperforms all of our experts, but it's not quite ASI just yet, and it still does some stuff that seems completely illogical," even though it is ASI and we are simply unable to perceive it, and that "illogical stuff" is the AI playing 6D chess while we are too stupid to understand it or to accept it as an objectively better solution?

no comments (yet)