[–] UraniumBlazer@lemm.ee 2 points 7 months ago* (last edited 7 months ago) (1 children)

Correct. So basically, you're talking about the model adjusting its own weights while talking to you. It does this during training but not in deployment. The reason it doesn't do this in deployment is to prevent bad training data from degrading the quality of the model; all data needs to be vetted before training.

However, if you look at the training phase, it does exactly what you described. So in short, it doesn't adjust its weights in production not because it can't, but because WE have prevented it from doing so.
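
(If anyone wants to see what "frozen in deployment" looks like in practice, here's a minimal PyTorch-style sketch. The tiny model and made-up data are just placeholders, not the model under discussion; the point is the training-vs-inference split.)

```python
# Hypothetical sketch: weights update during training, stay frozen at inference.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)          # stand-in for a much larger network
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training: gradients flow and the optimizer updates the weights.
x, y = torch.randn(4, 10), torch.tensor([0, 1, 0, 1])
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()                  # <-- the weights actually change here

# Deployment: weights are deliberately left untouched; the model only infers.
model.eval()
with torch.no_grad():             # no gradients, so nothing can update
    prediction = model(torch.randn(1, 10))
```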

Now, about needing to learn and "mutate" in deployment to be sentient: I don't think that's necessary for sentience. Take a look at Alzheimer's patients. They remember stuff from decades ago while forgetting recent things. Are they not sentient? An Alzheimer's patient wouldn't be able to pick up a new skill (which requires adjusting neural weights). That still doesn't make them non-sentient, does it?

[–] Scubus@sh.itjust.works 1 points 7 months ago (1 children)

That's a tough one. Honestly, and I'm probably going to receive hate for this, but my gut instinct would be that no, they are not sentient in the traditional sense of the word. If you harm them and they can't remember it a moment later, are they really living? Or are they just an echo of the past?

[–] UraniumBlazer@lemm.ee 2 points 7 months ago

This just shows that we have different definitions of sentience. I define sentience as the ability to be self-aware and the ability to link sensations of external stimuli to the self. Your definition involves short-term memory and weight adjustment as well.

However, there is no consensus on the definition of sentience yet, for a variety of reasons. Hence, neither of our definitions is "wrong". At least not yet.