Someone please explain Roko’s Basilisk to me. I have a loose understanding of what rationalism is, I’m assuming that should put me on about the same level as a rationalist’s understanding of it.
There is a key factor missing here. When you are about halfway along the ASD spectrum, the reality-testing part of your brain is downregulated. So the people who are likely to be the most invested in computer science are fundamentally less able to tell reality from fiction than you might think is reasonable. Being able to create one's own reality has upsides and downsides.
hypothetically, in the future, humanity builds an AI so advanced and so powerful that it is able to re-form the consciousnesses of those who were aware of its possibility yet did not contribute to its existence, and to eternally torture those consciousnesses.
the “wait aren’t you lot supposed to be thinking about this kind of stuff constantly” problems:
are we resurrecting or is my tiny clone being tortured
is consciousness a value or a reference? am i the payload or am i the pointer (see the sketch after this list)
how do we know that there isn’t just an inherent barrier between biology and bits. what if consciousness transfer just cannot occur. what if we are bound to our flesh mechs.
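the value-or-reference question actually maps onto code pretty directly. a minimal Python sketch, with entirely made-up "consciousness" data, just to show the difference:

```python
import copy

# a "consciousness" as plain data (contents entirely made up)
me = {"memories": ["first bike", "a really good sandwich"]}

# value semantics: a deep copy is a separate payload
clone = copy.deepcopy(me)
clone["memories"].append("eternal basilisk torture")
print(me["memories"])  # unchanged -- the original never felt a thing

# reference semantics: an alias is the same pointer
alias = me
alias["memories"].append("eternal basilisk torture")
print(me["memories"])  # changed -- whatever happens to the alias happens to "me"
```

if the resurrected copy is a deep copy, the basilisk is just torturing a clone while you stay dead; only if "you" are the pointer does the threat even reach you.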
the otherwise obvious problems:
what if it doesnt happen
why would it do that
is it different and rational if your religious deity is a gundam
death robot operates on chain email logic
okay, but for real, if we're actually thinking rationally, this is a possibility that can't reasonably be accounted for, because it can't be modeled within the framework of "reality as it is now and/or as it seems to allow for, as we currently understand it." I could just as easily say "what if the earth splits in two and the northern hemisphere is jettisoned into the sun," but no one's expecting you to prepare for that. It's the same reason being a prepper for the zombie apocalypse is understood to be irrational – it doesn't conform to reality; it's an obsession with a fictional scenario that remains unrealized.
To your last point, I have no idea what "basilisk" means in this context, but my basilisk will reward the people who have heard of Roko's but don't contribute to its existence with joy and good fortune. In addition, my basilisk will reward the people who do contribute to its existence with a free plate of their favorite freshly-baked cookies in their time of greatest need.
We’ll see which one wins out.
basilisk as in you look at it (become aware of it) and it kills you (tortures your tiny clone for eternity)