this post was submitted on 28 Nov 2024
425 points (98.4% liked)

Science Memes
[–] dragonfucker -3 points 5 days ago (2 children)

The sweet release of death.

Or, you know, we could devote serious resources to studying the nature of consciousness instead of pretending that we already have all the answers, and we could use that knowledge to figure out how to treat AI ethically.

Utilitarians believe ethics means increasing happiness. What if we could build AI farms with trillions of simulants doing heroin all the time with no ill effects?

[–] Irelephant@lemm.ee 4 points 5 days ago

We are devoting serious resources to studying the nature of consciousness.

[–] VeganCheesecake@lemmy.blahaj.zone 3 points 5 days ago* (last edited 5 days ago) (1 children)

End commercial usage of LLMs? Honestly, I'm fine with that, why not. Don't have to agree on the reason.

I am not saying that better understanding the nature of consciousness wouldn't be great, but there's so much research that deserves more funding, and that isn't really an LLM problem but a systemic one. And I just haven't seen any convincing evidence that current models are conscious, and I don't see how they could be, considering how they work.

I feel like the last part is something the AI from the paperclip thought experiment would do.

[–] dragonfucker -1 points 5 days ago (1 children)

And I just haven't seen any convincing evidence that current models are conscious, and I don't see how they could be, considering how they work.

Drag isn't saying they're conscious either. A being doesn't have to be conscious in order to suffer. Drag is perfectly capable of suffering while unconscious, and if you've ever had a scary dream, so are you. Drag thinks LLMs act like people who are dreaming. Their hallucinations look like dream logic.

[–] VeganCheesecake@lemmy.blahaj.zone 1 points 4 days ago* (last edited 4 days ago)

I mean, I don't agree, but I also don't think I'll be able to shake that opinion, so agree to disagree, I guess.