[–] UraniumBlazer@lemm.ee 2 points 7 months ago (1 children)

"Intelligence" - The attribute that makes a system propose and modify algorithms autonomously to achieve a certain terminal goal.

The intelligence of a system has nothing to do with what its terminal goal is. The magnitude of intelligence merely tells us how effectively the system pursues that goal.
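
To make this concrete, here's a toy sketch (everything in it is made up for illustration; it is obviously not a real AI). The terminal goal is trivially dumb; how reliably the system gets there is a separate question, and that's the part I'm calling "intelligence":

```python
import random

def hill_climb(score, candidate, mutate, steps=2000):
    """Generic optimizer: proposes modifications to a candidate and
    keeps whichever version scores better against the terminal goal."""
    best = candidate
    for _ in range(steps):
        proposal = mutate(best)
        if score(proposal) > score(best):
            best = proposal
    return best

# Terminal goal: match a target string. The goal says nothing
# about how capable the optimizer pursuing it is.
target = "hello world"
score = lambda s: sum(a == b for a, b in zip(s, target))
mutate = lambda s: "".join(
    random.choice("abcdefghijklmnopqrstuvwxyz ") if random.random() < 0.1 else c
    for c in s
)

print(hill_climb(score, "x" * len(target), mutate))
```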

Being self-aware is merely a step towards becoming more and more intelligent. If a system has to interact with its surroundings, it needs to be able to recognise that it is distinct from its environment.

You are such an intelligent system as well. It's just that instead of having one terminal goal, you have many terminal goals (some of which may change with time, while others might not).

You (this intelligent system) exist in a biological structure. You are nothing but data encoded in a biological form factor, with algorithms that execute through biological processes. If this data and these algorithms were executed on a non-biological form factor, would it be any different from you?

LLMs operate on some of the same principles that our brains do. Can you see how my point above applies?

[–] Omega_Haxors@lemmy.ml 0 points 7 months ago* (last edited 7 months ago) (1 children)

It's like you didn't even read what I posted. Why do I even bother? Sophists literally don't care about facts.

[–] UraniumBlazer@lemm.ee 2 points 7 months ago (1 children)

Yes, I read what you posted and answered accordingly. I just didn't dumb it down enough. So let me dumb it down further.

Your main objection was the simplicity of the goal of LLMs: predicting the next word. Somehow, this simplistic goal makes the system stupid.
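
And yes, the objective itself really is that simple to state. Here's a toy sketch of it (a bigram counter; a real LLM instead trains a neural network on essentially the same next-word objective, so don't mistake this for how LLMs are actually built):

```python
from collections import Counter, defaultdict

# Toy next-word predictor: the entire "goal" is to guess the
# most likely next word given the current one.
corpus = "the cat sat on the mat the cat ate".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(word):
    return counts[word].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat"
```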

In my reply, I first said that self-awareness occurs naturally as a system becomes more and more intelligent, and I explained why. I then went on to explain how a simplistic terminal goal has nothing to do with actual intelligence. Hence, no matter how stupid/simple a terminal goal is, if an intelligent system is challenged enough and given enough resources, it will develop sentience at some point.

[–] Omega_Haxors@lemmy.ml 0 points 7 months ago

Exactly. I literally said none of that shit. You're just projecting your own shitty views onto me and asking me to defend them.