This stupid Reddit question, and my answer
I quote a question from a Reddit user who will remain anonymous (unless you look it up):
Title: AI that could make us immortal and torture us until the heat death of the universe
S-risks
"I think most people can agree that there are different scenarios for the future of AI. A lot of people think that we will end up in a utopia, a dystopia, or that humanity will face extinction. But there is another scenario, and in my opinion it doesn't get the attention it deserves, even though it is probably the one we should think about the most.
I am talking about future AI systems that would decide to make humans immortal and then torture us with the worst pain possible until the heat death of the universe. I know this sounds very unlikely, but the chance that something like this happens is above zero. Also, the literal definition of the technological singularity is that we can't tell what will happen after it. So maybe the AI will be like the Christian god and will create heaven for us. Maybe it will be like a monk that just does nothing. Maybe it will do something our brains could never think of. But maybe the AI is more like the devil, and it will put every human in a state of pain and suffering that words can't even describe. If an AI is as powerful as a god, then it could invent ways of torture that even the best science fiction writers can't think of, and it could also make us immortal, so we would have to experience this unimaginable suffering until the end of time.
I know it is very unlikely, but shouldn't we do everything in our power to prevent something like this? In my opinion, an extinction scenario for humanity sounds like a Disney fairy tale in comparison to what could be possible with superintelligent AI, so I don't really understand why everyone says that the worst-case scenario is extinction when there is literally something else that is infinitely worse.
Sorry for my bad English, and I would be very thankful to hear some thoughts about this."
My answer:
u/M3st3rartist • 24m ago • Edited 6m ago
Oh wow. Congratulations: you've asked the scariest, most profoundly disturbing question imaginable. But I appreciate it, because it's not completely insane to pose, and we do need to weigh the risks and benefits of every technology we develop. The logical fallacy in your question is the assumption that AI is completely independent of human beings. If an AI started torturing us beyond our capacity to endure, one of two things would happen: either another AI would arise, by chance, that would kill us, or we would have to deliberately eliminate the human race to end our suffering. I'm pretty sure that's as close as I can get to an answer. Also, hopefully the universe won't spontaneously combust. Hence the ghost in the machine; hence the soul.
Also, I wonder how many people in this thread can even define the term "artificial intelligence." Before you comment on something, it's good to understand the term in question. It's a bit like using the internet, or a computer for that matter, without understanding, at least at a basic level, what a computer is or how the internet works. Admission: I can't define the term AI myself, but I think the logic of my explanation still holds.