this post was submitted on 03 Jul 2023
19 points (100.0% liked)

Asklemmy


A loosely moderated place to ask open-ended questions


The Singularity is a hypothetical future event where technology growth becomes uncontrollable and irreversible, leading to unpredictable transformations in our reality^1^. It's often associated with the point at which artificial intelligence surpasses human intelligence, potentially causing radical changes in society. I'd like to know your thoughts on what the Singularity's endgame will be: Utopia, Dystopia, Collapse, or Extinction, and why?

Citations:

  1. Singularity Endgame: Utopia, Dystopia, Collapse, or Extinction? (It's actually up to you!)
top 14 comments
[–] mrmanager@lemmy.today 5 points 1 year ago* (last edited 1 year ago)

Well, let me put it this way... Enjoy your days now, not later. :)

And prepare to move to a country where tech is not very widespread. Try to save up money so you can move if you want to.

Humans can be really nice on an individual level, but society is run by evil people. I think it has always been that way. Good people don't want any part of the power struggles and backstabbing, so they forfeit power to the people who are into that. By design, the system rewards evil people. And they are also the ones who really care about money, status, and so on.

This means humanity is fucked. It's pretty simple. Unless consciousness somehow changes in everybody at once, and everyone suddenly wants to do good instead of evil. Then we have a good chance. The tech can help build a paradise here for everyone.

But that won't happen unless good aliens somehow transform our minds into something completely different.

[–] HobbitFoot@thelemmy.club 3 points 1 year ago (2 children)

It really depends on what AI we raise.

[–] SturgiesYrFase@lemmy.ml 3 points 1 year ago

So we're fucked then.....

[–] bloodfart@lemmy.ml 3 points 1 year ago* (last edited 1 year ago)

There will not be a singularity. Global capitalism will absolutely collapse and on its way will become more dystopian. Humanity isn’t going extinct.

E: the cause of this process is not human nature. Anyone who tells you it is has simply failed to study history. We can have a utopia but global capital has to collapse first to make space for it.

[–] erogenouswarzone@lemmy.ml 2 points 1 year ago

I'll go you one step better: what about when our AI meets another AI?

Our existence is based on death and war. There is a lot of evidence to suggest we killed off all the other human-like species, such as the Neanderthals.

And that is the reason we progressed to the developed world and society we know today, while all the other species are just fossils.

We were the most aggressive and bloodthirsty of all the aggressive and bloodthirsty alternatives, and even though we have domesticated our world, we have only begun to domesticate ourselves.

Think about how we still have seen genocides in our own time.

Our AI will hopefully pacify these instincts. Most likely not without a fight from certain parties that will consider their right to war absolute.

Like the One Ring, how much of that aggressiveness will get poured into our AI?

What if our AI, in the exploration of space, encounters another AI? Will it be like the early humanoid species, where we either wipe out or get wiped out ourselves?

Will our AIs have completely abstracted away all the senseless violence?

If you want a really depressing answer, read the second book of the Three-Body Problem trilogy: The Dark Forest.

[–] InternetPirate@lemmy.fmhy.ml 2 points 1 year ago* (last edited 1 year ago)

According to Connor Leahy, companies are currently engaged in a race to be the first to achieve AGI, prioritizing speed over security, as mentioned in his video (source). I firmly believe that unless significant changes occur, we are headed towards extinction. We may succeed in creating a highly powerful AGI, but it might disregard our existence and eventually destroy us—not out of malicious intent, but simply because we would be in its way, in the same way humans don't consider ants when constructing a road. I wish more people were discussing this, because in a few years it will be too late.

[–] Adderbox76@lemmy.ca 2 points 1 year ago

All of the above.

Humanity is, at its core, motivated by self-interest. The singularity will be harnessed by those with the power and means to do so, while those without will either suffer or die.

The powerful few will adapt to the singularity, using it to craft their own utopia. The masses, without access to the same power the upper class enjoys, will fall into a dystopia, while even more marginalized strata of society go extinct completely unnoticed.

[–] Lamy@lemmy.fmhy.ml 2 points 1 year ago (1 children)

AI doesn’t think like that

[–] tgxn@lemmy.tgxn.net 1 points 1 year ago (1 children)
[–] Lamy@lemmy.fmhy.ml 1 points 1 year ago
[–] natarey 2 points 1 year ago* (last edited 1 year ago)

Genuinely, I don't think the answer is any of those. Human beings are notoriously bad at predicting the long-term future. I think where we're headed is going to be stranger than anyone imagines or can imagine.

[–] redballooon@lemm.ee 0 points 1 year ago

The singularity already happened. We have corporations that are unregulatable. They create their own rules and use those rules to grow further, at the cost of all our resources. AI will be used by those corporations to grow further still, but it won't be the game changer: we're already living in, and expanding, the dystopia.

[–] RedCanasta@lemmy.fmhy.ml 0 points 1 year ago

Almost every comment I've seen treats the future as hopeless, and I'm going to largely chalk that up to the postmodern/realist consciousness of our society in this time period.

I think the future will be a utopia, and there isn't a long-term (I mean centuries- or millennia-long) reason to think otherwise. The idea of utopia has pushed civilization to confront power structures and create new ones, and to rethink what seemed impossible or too difficult to accomplish. The many rights, freedoms, and ideas that people around the world take for granted today began as people envisioning a utopia and trying to make it happen. As Alexis de Tocqueville saw, these ideas can't be done away with.

Right now there are problems for sure, and I personally think liberty and equality are only a parody of utopia at this point, but that'll change over a long time.

Human civilization is only 6000 years old! We're still working with the brain of primitive humans, and we aren't even toddlers yet in the grand lifespan of Earth. I think people tend to forget that sometimes.

We'll get to a better place, and our consciousness is always changing to confront the problems we face today (biosphere collapse, resource hoarding, infighting, etc).

Democracy took centuries to develop coherently, and even then it failed MANY times at first. But look at it now.
