this post was submitted on 13 Jun 2024
101 points (100.0% liked)

Technology

 

The company he works at: eternos.life

[–] henfredemars@infosec.pub 4 points 5 months ago* (last edited 5 months ago) (1 children)

Meant, in this context, refers to the conditions that humans have faced over a long period of time and may be better suited to coping with from a survival point of view. I'm an atheist, so I find it strange that you chose to read my comment as implying intentional design. Certainly, AI has existed for a much shorter time than the phenomenon of a human encountering the death of a loved one. Indeed, death has been a common theme throughout history, and the tools and support available for coping with it and relating it to other human experiences far exceed those available for coping with the potential issues that come with AI.

I think one can absolutely speak of needs and adaptation for a human experience as common as death. If you find something belittling about that opinion, I'm not sure how to address you further. I may simply have to be wrong.

[–] frog@beehaw.org 4 points 5 months ago

Just gonna say that I agree with you on this. Humans have evolved over millions of years to respond emotionally to their environment. There's certainly evidence that many of the mental health problems we see today, particularly at the scale we see them, are in part due to the fact that we evolved to live in a way very different from our present lifestyles. And that's not about living in cities rather than caves, but more to do with the amount of work we do each day, the availability and accessibility of essential resources, the sense of community and connectedness within small social groups, and so on.

We know that death has been a constant of our existence for as long as life has existed, so it logically follows that dealing with death and grief is something we've evolved to do. Namely, we evolved to grieve for a member of our "tribe", and then move on. We can't let go immediately, because we need to be able to maintain relationships across brief separations, but holding on forever to a relationship that can never be continued would make any creature unable to focus on the needs of the present and future.

AI simulacra of the deceased give the illusion of maintaining the relationship. It's entirely possible that this will artificially prolong the grieving process, when the natural cycle of grieving is to eventually reach a point of acceptance. I don't know for sure that's what would happen... but I would want to be absolutely sure it's not going to cause harm before unleashing this AI on the general public, particularly on vulnerable people (which grieving people are).

Although I say that about all AI, so maybe I'm biased by the ridiculous ideology that new technologies should be tested and regulated before vulnerable people are experimented on.