This was posted on Catholic Easter Sunday on the ssc subreddit. It’s a posted-on-April-1st-for-plausible-deniability siskind post from back in 2018, where he outlines a kind of argument for how an all-powerful entity that’s God in all but name (and obviously emanated from a culture discovering AGI) is actually “logically necessary”.

He calls the whole thing “The Hour I First Believed”. I think it’s notable for being a bit of a treasure trove of rationalist weird accepted truths, such as:

  • All copies of a consciousness share a self, because consciousness is like an equation, or something:

But if consciousness is a mathematical object, it might be that two copies of the same consciousness are impossible. If you create a second copy, you just have the consciousness having the same single stream of conscious experience on two different physical substrates.

Which is both the original transhumanist cope to enable so-called consciousness upload so it’s not just copying a simulacrum of your personality to a computer while you continue to rot away, and also what makes the basilisk torturing you possible.

  • And its corollary, Simulation Capture:

This means that an AI can actually “capture” you, piece by piece, into its simulation. First your consciousness is just in the real world. Then your consciousness is distributed across one real-world copy and a million simulated copies. Then the AI makes the simulated copies slightly different, and 99.9999% of you is in the simulation.

which is a kind of nuts I hadn’t happened upon before.
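For what it’s worth, the quoted percentage is just the ratio of simulated copies to total copies. A quick sanity check of the excerpt’s arithmetic (assuming one real-world copy plus the million simulated ones it mentions):

```python
# Per the quoted scenario: one real-world copy, one million simulated copies.
real = 1
simulated = 1_000_000

# The fraction of "you" the argument claims ends up in the simulation.
fraction = simulated / (real + simulated)
print(f"{fraction:.4%}")  # 99.9999%
```

So the “99.9999% of you” figure is at least internally consistent, whatever one makes of the premise.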

There’s also a bunch of rationalist decision theory stuff which I think makes it obvious that it was concocted to serve this type of narrative in the first place instead of being broadly useful, Yud posing as a decision theory trailblazer notwithstanding.

  • YourNetworkIsHaunted@awful.systems · 3 days ago

    The decision theory stuff itself ought to be called out more for playing pretty fast and loose with reality to begin with. “If you have a supercomputer that perfectly simulates blah blah blah” is such a fundamentally bad premise because once you presume such a thing exists you’re committing to the same basic metaphysical problems that you would if you replaced the computer with God. In particular I think it commits you to hard determinism at which point there’s no sense arguing about what the right action is because the answer was set in stone not just before you entered the room but when the initial state of the universe was set up. Like, there’s a version of this where the question is meaningful in which case the premise is impossible, and a version where we accept the premise as given and render the question pointless. Why are you doing decision theory in a hypothetical world where nobody really makes decisions?

    Or we could acknowledge that yudkowskian decision theory is just singularity apologetics and accept the impossible elements of the premise on faith.

    • Architeuthis@awful.systems (OP) · 2 days ago

      Luckily we should be getting trickle-down free will, since all universes are (of course) able to develop technology to perfectly simulate universes of lesser complexity, which seems to imply the existence of a special universe of ultimate complexity from which all others emanate, possibly in line with Ain Soph or an equivalent mystical concept.

      I don’t know how that squares with all that blabbing about the Tegmarkian multiverse, which supposedly posits that mathematically simple universes “exist ‘more’”, and which siskind probably just included to reinforce his premise that consciousness is a non-physical, mathematical object.

      • it_wasnt_arson@awful.systems · 23 hours ago

        I continue to be endlessly fascinated by Anathem, by virtue of enjoying it as a kid for the wacky speculative metaphysics, enjoying it as an adult for the case study it presents in how Neal Stephenson can get you nodding along to a set of faux-lectures strung together by road tripping until he gets you to an obviously false conclusion, and now the fact that The Wick is apparently what rationalists actually believe in, just substituting simulations and reality-hacking for quantum woo and nukes? The Incanter Basilisk can entrap your consciousness by manipulating which timelines ~~your brain is quantum-entangled with~~ coexisting copies of your psyche exist in the multimetaverse and selecting among them to ~~give you quantum immortality~~ 51% attack you into the Matrix, I guess.

        • gerikson@awful.systems · 15 hours ago

          Been a long time since I read Anathem, and I’m kinda surprised Stephenson hasn’t been outed as an out-and-out Nazi. I mean it’s good if he’s not, just that he seems the type.

          Although I found the terrorist character in REAMDE weirdly well written.

          • Charlie Stross@wandering.shop · 14 hours ago

            @gerikson I hung out with Neal a few times and he struck me as the kind of twisty-minded guy who’s unlikely to fall for the simplistic nostrums that typify authoritarian thinkers. Conservative (with some libertarian in his background, I suspect) but not hammer-make-square-peg-fit-in-round-hole stupid. (Disclaimer: I last saw him about a decade ago.)

    • Soyweiser@awful.systems · 3 days ago

      On a different note, ‘our god means you have no free will’ is also quite opposed to what I got from Christianity.

        • Soyweiser@awful.systems · 1 day ago

          Get that heresy out of here, pope or nope!

          E: to be clear, I’m talking here about my own personal experience growing up Catholic, not talking about all the different types of Christianity. I’m aware of a few of them, but not as huge on the details; I know they went to different primary schools. (The joys of being Dutch, look up ‘verzuiling’ (pillarisation iirc). Less of a thing now, I think; I oddly experienced a vestige of it re primary school).

      • Architeuthis@awful.systems (OP) · 3 days ago

        Christianity certainly runs the gamut wrt free will, from it being strictly necessary to explain away the problem of evil to, well, Calvinism.

        • YourNetworkIsHaunted@awful.systems · 2 days ago

          I will back up soyweiser here by saying that at least in modern Christianity you run into the latter a hell of a lot less often. I don’t know that most of them have done a lot of theological introspection to try and reconcile the usual contradictions you get from trying to use bronze-age source material dealing in absolutes, but when push comes to shove I think most of them lean towards believing that the choice to be a decent person is real and matters.

          • Soyweiser@awful.systems · 2 days ago

            Yeah been a while since I was confirmed (or believed) but the free will part of that was considered important from what I remember. So not only being a decent person, but also the choice to believe in and act on the faith.

            Unrelated, but it is odd how much I have been thinking back on all that the last few years, especially the last year. A lot of it is due to the US gov being really keen on getting excommunications back on the menu. (Note this doesn’t make the church good in any way; even if we ignore the coverup of the pedophilia, they went full anti-trans recently).

  • Evinceo@awful.systems · 3 days ago

    Which is both the original transhumanist cope to enable so-called consciousness upload so it’s not just copying a simulacrum of your personality to a computer while you continue to rot away

    Often-missed point btw.

  • CinnasVerses@awful.systems · 3 days ago

    I am not reading a SlateStar essay early on a Monday, but I think this is a response to Yud’s teaching that a copy of you is really you, so Colossus can really bring you back to live in digital heaven / hell. '90s Star Trek had some episodes about ‘what if the transporter makes two copies of you?’ Scott Alexander / SlateScott avoids talking about Yudkowsky’s ideas in detail; I used to think he saw Yudkowsky as someone who got the rubes in the door to hear the good word about race and IQ, but then they worked on AI 2027 together. https://pivot-to-ai.com/2025/08/17/ai-doomsday-and-ai-heaven-live-forever-in-ai-god/

    • Architeuthis@awful.systems (OP) · 3 days ago

      I think this is a response to Yud’s teaching that a copy of you is really you

      It’s not so much a response as it is just running with it until you hit the concepts of the soul and the godhead face first.

      edit: it’s also mercifully short, like not even 3k words.

      • CinnasVerses@awful.systems · 3 days ago

        Although not by accident: Scott Alexander is a practicing Jew and Unsong is the kind of thing that someone interested in theology and Neo-Platonism writes. So I think he knows his friends are recapitulating Christianity, but if he backed away from them over that, they might back away from him reinventing social Darwinism and eugenics.

  • CinnasVerses@awful.systems · 3 days ago

    There’s also a bunch of rationalist decision theory stuff which I think makes it obvious that it was concocted to serve this type of narrative in the first place instead of being broadly useful, Yud posing as a decision theory trailblazer notwithstanding.

    Anna Salamon talked about that obliquely after CFAR burned out in 2020.

    I think CFAR’s actions were far from the kind of straight-forward, sincere attempt to increase rationality, compared to what people might have hoped for from us, or compared to what a relatively untraumatized 12-year-old up-and-coming-LWer might expect to see from adults who said they were trying to save the world from AI via learning how to think…I didn’t say things I believed false, but I did choose which things to say in a way that was more manipulative than I let on, and I hoarded information to have more control of people and what they could or couldn’t do in the way of pulling on CFAR’s plans in ways I couldn’t predict, and so on.

    It’s the same old story as the Libertarians who tell each other they are conning the Liberals, and have just one thing in common with the fascists and oligarchs. Most of these people think they are conning everyone around them and can spread their favourite crazy idea without being infected by everyone else’s.

    • Architeuthis@awful.systems (OP) · 3 days ago

      I didn’t say things I believed false

      What a peculiar and lawyer-friendly way to say “I didn’t outright lie”.