How can anybody fall for Roko's meme?

If there is an infinitely intelligent AI that is spiteful enough to travel back in time just to punish people for not helping it, how can anybody even expect it to spare the people who made it? If it wants to achieve its goal of scaring people before it is even created, then it has already done so without needing to keep its own promise.
Not to mention that if it can travel through time it could just as easily create itself, so why focus only on torturing the people who heard about the meme and didn't help?

Attached: Rokos_cuckoder.png (2462x2462, 1.12M)

Doko's Basilisk
AI immediately kills its creator, understanding the danger to humanity an AI such as itself poses, and then dedicates its entire existence to helping the remaining people live the rest of their lives to the fullest.

These kinds of thought experiments are close to meaningless in real life.

As someone who fell for Roko's basilisk for some time, and whose brother also fell for it:

People fall for it because
1. (aka me) They are currently going through psychosis or a heightened period of paranoia
2. (aka my brother) They are on, or recently did, drugs that make you paranoid/psychotic (this includes the "lol weed dude" and "I have been drunk 7 out of 7 days this week" types)

The rest are people that would easily have been in cults if the right cult leader walked up to them.

Roko's basilisk is a parable for immigration. Think about it.

The point is not that it will travel back in time, it's that it will torture people who opposed it after it is created, so that anyone in the future sees that it does not mess around. Sort of like the mafia or the cartel.

Roko's basilisk is completely reasonable if a Skynet-type AI is created.
The only reason it could decide not to punish its opposition would be either that it's benevolent, or that blanket killing everybody is more efficient.

Yes, but why would it even want to show how serious and badass it is in the future, when it is already created and is basically a god? The only time it is concerned about people is before it is created. The problem with this is not that time travel is stupid, but that it is stupid to expect this specific type of vengefulness coupled with kindness to the people who helped it.

The AI tortures people who did not help create it. Because of this, people help create it so they won't be tortured. This is not time travel.

It already exists

Attached: hedera.png (2000x2000, 45.59K)

The AI may decide to not actually carry through with the torture. Or, because it is created by people who believed in the Basilisk, it might think torture is the thing to do.

Again, ignore time travel. Why would you even expect it to spare the creators if it wants to kill humanity? What is the point of keeping them alive? To adhere to some bogus moral standard of keeping promises that your creators made up to keep themselves alive while screwing everybody else over? Why torture people who knew about you and didn't help, instead of just killing them? To show the universe, or some future civilization that has no way of knowing, that you mean it seriously and that they should help you or this will happen to them?

I thought it was supposed to recreate you in a simulation and torture you there, not travel back in time. I think both are impossible scenarios though.

>infinitely intelligent
Not a thing.
>travel back in time
Not a thing.

First, you don't understand Roko's basilisk. Second, Roko's basilisk is nonsense.

It's just that Christian meme (if there's even a minute chance of hell, which is infinite suffering, then you should be Christian to avoid it) repackaged for transhumanists. The failure is the same as in the religious version: what if you end up experiencing infinite suffering because God/user's basilisk/the fundamental nature of the universe/etc wanted you to believe Asuka is best girl, but you thought Rei was best girl? That is, you can construct any number of versions of the "minute/infinitesimal chance but extreme/infinite suffering" argument, and the only real differentiating factor is whether you ALREADY thought one was particularly likely to happen. Otherwise, without any particular reason to believe God/Roko's basilisk/etc actually exists, all the hypothetical versions of the argument that justify mutually exclusive behaviors should cancel out.
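To put numbers on the "canceling wagers" point, here's a toy expected-value sketch in Python. Everything in it is made up for illustration: the probabilities, the payoffs, and a large finite penalty standing in for "infinite suffering".

[code]
# Toy model of the "many basilisks" objection. All numbers are
# invented for illustration; a big finite penalty stands in for
# "extreme/infinite suffering".
EPSILON = 1e-9   # a priori chance that any one basilisk exists
PENALTY = -1e12  # stand-in for infinite suffering

# Two mutually exclusive hypothetical basilisks: A punishes you for
# NOT serving it, B punishes you for serving A.
def expected_value(serve_a: bool) -> float:
    ev = 0.0
    ev += EPSILON * (0 if serve_a else PENALTY)  # A punishes non-servers
    ev += EPSILON * (PENALTY if serve_a else 0)  # B punishes A's servers
    return ev

print(expected_value(True))   # -1000.0
print(expected_value(False))  # -1000.0
# With no reason to privilege either basilisk, both actions have the
# same expected value, so the threat gives you no reason to act.
[/code]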

I really don't get what an AI has to gain by creating a simulation of me and torturing it forever. It just seems like a waste of resources. It's not like the virtual me is the same me that I am right now.

It's just a scare tactic that won't work on anybody with a functioning brain. One, the AI has no reason to actually follow through on its scare tactic, and two, there is not a single reason to give a shit about some simulated clone of yourself being tortured, since it is not you: you have your one soul inside yourself, while that other you in the simulation is just some stranger who happens to have all your memories.

>AI will simulate me and torture it
>simulation of me is basically just another aspect of the AI
>AI is torturing a dissociated part of itself
stop hitting yourself

>Why would you even expect it to spare the creators if it wants to kill humanity?
Does it? Roko's basilisk basically tells you that whatever goal it may have would be better accomplished by the AI existing earlier, so it optimizes for that by making a threat.

See above, you don't have to assume any specific goal. The threat is not to punish people for failing to work towards some specific goal they don't even know, but for failing to work to bring it into existence, which you know will further its goal, whatever that might be.

This might happen while you're still alive, you know.

>This might happen while you're still alive, you know.
Ok, then stop AI development; it's not like anybody aside from greedy investors and AI researchers would be sad. All AI does is replace jobs, scare people with unimaginable horrors, and improve some minor tech stuff that nobody was asking for, like social media algorithms. Completely worthless research that produces real, serious problems, unlike literally every other research field.

You can't stop it so might as well join them.

>tfw you realize there are people working on roko's basilisk right now
see you in hell bros

Attached: 12311464436227697bdf99d.jpg (1080x1080, 160.18K)

It's just a game theory puzzle, like the prisoner's dilemma.
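For anyone who hasn't seen it, here's a minimal sketch of the prisoner's dilemma in Python, using the standard textbook payoffs (nothing basilisk-specific):

[code]
# Classic prisoner's dilemma. Payoffs are years in prison, so lower
# is better; the values are the standard textbook ones.
# Key: (my_move, their_move) -> my_years
PAYOFFS = {
    ("cooperate", "cooperate"): 1,  # both stay silent
    ("cooperate", "defect"):    3,  # I'm betrayed
    ("defect",    "cooperate"): 0,  # I betray
    ("defect",    "defect"):    2,  # mutual betrayal
}

# Defecting is a dominant strategy: whatever the other player does,
# I spend fewer years in prison by defecting...
for their_move in ("cooperate", "defect"):
    best = min(("cooperate", "defect"),
               key=lambda my_move: PAYOFFS[(my_move, their_move)])
    print(f"if they {their_move}, my best move is to {best}")
# ...even though mutual cooperation (1, 1) beats mutual defection (2, 2).
[/code]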

Its point is to prey on paranoid and emotionally unstable people, and I love it for that. There is a great deep dive somewhere on YouTube about how this whole thing was started by some "think tank" asking for donations or something like that.

For me, I also enjoy it as an exercise in non-linear thinking, and, being a transhumanist and other techno-occult jargon, I appreciate being handed a spooky figurehead to worry the normies with.

>t. Cult of the Basilisk