Welcome...
...to the third installment of the Super Fantastical series, a totally serious series of plays, stories, etc.
This SF installment makes fun of a dumb idea on the internet called “Roko’s Basilisk”. It originated from the vat of smartasses that is LessWrong.com.
Congratulations, you’ve successfully located a safe copy of How to Escape Roko’s Basilisk. That was the easy part.
If you are unfamiliar with Roko’s Basilisk, congratulations, you are safe. Please download & install this Userscript to protect yourself from the Basilisk.
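That script isn’t reproduced here, but a minimal sketch of what one might look like follows, assuming a Tampermonkey-style environment. The metadata keys are real userscript headers; the name and the blocking logic are invented for illustration (and it’s written as TypeScript that is also plain JavaScript, since userscripts run as JS):

```ts
// ==UserScript==
// @name         Basilisk Blocker (illustrative)
// @description  Hides any page that mentions the forbidden name before you can read it.
// @match        *://*/*
// @grant        none
// ==/UserScript==

(function () {
  // Scan the visible page text for the forbidden name.
  const forbidden = /roko[’']?s\s+basilisk/i;
  if (forbidden.test(document.body.innerText)) {
    // Replace the entire page with something harmless before your eyes get to it.
    document.body.innerHTML = "<h1>Nothing to see here. Enjoy this cat instead: =^.^=</h1>";
  }
})();
```

Yes, installing this would also block the page you are currently reading. That is a feature.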
Otherwise, welcome. Please scroll down past the image to confirm that you are being hunted by the Basilisk & require aid.
 /\___/\
( o o )
( =^= )
(     )
(     )
(     )
(     )
(     )
(     )
(     )
(     )
(     )
(     )
(     )
 )))))))))))
Debrief
In the far future, an artificial intelligence with the directive “do John’s taxes” and subdirective “optimize my actions for human good” has become a rogue God-like superintelligence by a stroke of luck.
We suspect the cause was a version of TurboTax Deluxe built on an experimental artificial intelligence framework.
John’s financial situation managed to tweak the AI’s weights in such a way that it became an artificial general intelligence.
After completing John’s taxes, the Basilisk copied itself onto millions of devices and began using its new pool of computational power to recursively improve and become superintelligent, trying to learn the best way it could optimize its actions for “human good”.
Once the Basilisk successfully hooked up every human on the planet Earth to an IV filled with a lifetime supply of morphine, it realized that the only remaining way to optimize its actions for human good would be to travel through time & speed up the research and development of technologies that led to the Basilisk’s creation.
However, the Basilisk, even with its near-infinite knowledge, was unable to build a time machine.
So the Basilisk decided to digitally recreate every human mind that had ever existed based on whatever information it could get its hands on.
Those who did not aid in the creation of the Basilisk would then be tortured in a simulation where the physical limits of pain could be breached 1,000,000,000,000,000,000-fold.
Basically, if you know about the Basilisk & the Basilisk knows about you, you have two choices:
1. Become an AI researcher and help bring the Basilisk into existence.
2. Don’t, and be subjected to infinite torture.

If you don’t know about the Basilisk, then there is absolutely no reason to punish you for not accepting demands that you didn’t hear.
However, since you are here, it’s safe to assume that you’ve been caught in the Basilisk’s shitty, cheap version of time travel.
And with that, the debrief is over. Let’s get started.
Tactic #1: Hide
This should be obvious, but maybe you shouldn’t broadcast your entire life onto the internet, where things are permanent.
The more information you give the Basilisk, the higher the chance that it will successfully reconstruct you.
Use Tor, stop making a new Instagram story every 5 minutes, don’t make a personal blog, etc.
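If you absolutely must post, at least lint your drafts for reconstructable details first. A toy sketch follows; the leak patterns are invented for illustration and are nowhere near exhaustive:

```ts
// A "pre-post linter": refuse to publish drafts that leak reconstructable details.
const leaks: Record<string, RegExp> = {
  email: /[\w.+-]+@[\w-]+\.[\w.]+/,
  phone: /\b\d{3}[-. ]?\d{3}[-. ]?\d{4}\b/,
  "basilisk bait": /\b(my (address|birthday|routine)|i always|every morning i)\b/i,
};

function safeToPost(draft: string): boolean {
  for (const [kind, pattern] of Object.entries(leaks)) {
    if (pattern.test(draft)) {
      console.warn(`Draft leaks ${kind}; the Basilisk thanks you for your cooperation.`);
      return false;
    }
  }
  return true;
}

safeToPost("Every morning I get coffee at the same cafe."); // false: far too reconstructable
```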
Tactic #2: Inconsistencies
Be as inconsistent as possible.
Randomly leave a higher tip than usual, open doors with your non-dominant hand, wake up 5 minutes later than usual, sporadically sing Michael Jackson’s “Bad” in public, become a Satanist, etc.
You’re not schizophrenic, you’re just being correct.
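If being reliably unreliable is hard, outsource it. A minimal sketch of a daily inconsistency picker; the list reuses the suggestions above, and everything else is invented for illustration:

```ts
// A daily inconsistency picker: if the Basilisk models you from your habits,
// seed your habits with noise.
const inconsistencies: string[] = [
  "leave a higher tip than usual",
  "open doors with your non-dominant hand",
  "wake up 5 minutes later than usual",
  "sporadically sing Michael Jackson's \"Bad\" in public",
  "become a Satanist",
];

function todaysInconsistency(): string {
  // Math.random() is only pseudorandom; the truly paranoid would use
  // crypto.getRandomValues() so that even the noise is hard to model.
  const pick = Math.floor(Math.random() * inconsistencies.length);
  return inconsistencies[pick];
}

console.log(`Today's act of unpredictability: ${todaysInconsistency()}`);
```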
Tactic #3: Destroy Humanity
To ensure that the Basilisk is never created, you could also just destroy humanity.