this post was submitted on 03 Jul 2023
20 points (95.5% liked)

Asklemmy


The Singularity is a hypothetical future event where technology growth becomes uncontrollable and irreversible, leading to unpredictable transformations in our reality^1^. It's often associated with the point at which artificial intelligence surpasses human intelligence, potentially causing radical changes in society. I'd like to know your thoughts on what the Singularity's endgame will be: Utopia, Dystopia, Collapse, or Extinction, and why?

Citations:

  1. Singularity Endgame: Utopia, Dystopia, Collapse, or Extinction? (It's actually up to you!)
[–] erogenouswarzone@lemmy.ml 1 points 1 year ago

I'll do you one step better: what about when our AI meets another AI?

Our existence is built on death and war. There is plenty of evidence to suggest we killed off all the other human-like species, such as the Neanderthals.

And that is why we progressed to the point where we have the world and society we know today, while all the other species are just fossils.

We were the most aggressive and bloodthirsty species among all the aggressive and bloodthirsty alternatives, and even though we have domesticated our world, we have only begun to domesticate ourselves.

Consider that we have seen genocides even in our own time.

Our AI will, hopefully, pacify these instincts. Most likely not without a fight from certain parties that consider their right to wage war absolute.

Like the One Ring, how much of that aggressiveness will get poured into our AI?

What if our AI, in its exploration of space, encounters another AI? Will it be like the early human-like species, where we either wipe them out or get wiped out ourselves?

Will our AIs have completely abstracted away all the senseless violence?

If you want a really depressing answer, read the second book of the Three-Body Problem trilogy: The Dark Forest.