the-podcast guy recently linked this essay. It’s old, but I don’t think it’s significantly wrong (despite GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin.

  • KobaCumTribute [she/her]@hexbear.net · 7 months ago

    I didn’t say “you are perfectly happy and have no material problems whatsoever dealing with a traumatic injury and imperfect replacement,” but rather that this doesn’t represent some fundamental loss of self or unmooring from material contexts. People can lose so much, can be whittled away to almost nothing, right up to the point where continued existence becomes materially impossible due to a loss of vital functions, but through all of that they still exist: they remain the same ongoing being, even if the trauma transforms them and they become unrecognizable to others in the process.

    And would an upload be a person, legally speaking? Would your family consider the upload to be a person? That’s pretty alienating.

    If you suffer a traumatic brain injury and lose a large chunk of your brain, that’s going to seriously affect you and how people perceive you, but you’re still legally the same person. If that lost chunk were instead replaced with a synthetic copy, there might still be problems, but fewer than from losing it outright. So if that continues until you’re entirely running on the new synthetic substrate, then you have continued to exist through the entire process, just as you continue to exist through the natural death and replacement of neurons. For all we know, copying and replacing may not even be necessary: it might be enough to add synthetic bits and let them be integrated and subsumed into the brain by the same processes through which it grows and maintains itself.

    A simple copy taken like a photograph and then spun up elsewhere would be something entirely distinct, no more oneself than a picture or a memoir.

    • Frank [he/him, he/him]@hexbear.net · 7 months ago

      Eh. I’d argue that, inasmuch as “you” means anything, both forks would be equally the person; there’s no “original” who is more the person. It’s a point of divergence: both have equal claim to all experiences and history up to that point. Privileging the “original” over the “copy” is cultural prejudice; subjectively they’re the same person up to the moment of divergence.

      • KobaCumTribute [she/her]@hexbear.net · 7 months ago

        I don’t think that’s the right way to untangle that dilemma ethically, because it can lead people to jump to galaxy-brained “solutions” like “but what if you make sure only one of you exists at once?” that don’t make any sense or answer anything, but are still active cognitohazards for some people.

        You, as in the one that is in there right now: that instance would continue along its own discrete path for as long as it exists. If another instance were made and split off, that would be a person, a non-contiguous you, but it would not be the same you that is there right now. That distinction becomes important when dealing with cognitohazards like terminating the current instance as the new one is spun up, so that “you” get to be the one in the machine and there’s no perceptual break between the two.

        I’d argue that the ethical way to deal with forking copies like that would be to find ways to keep them linked up and at least partially synced, effectively making them component parts of a larger contiguous whole instead of just duplicating someone in a way that inevitably means at least one copy gets fucked over by whatever circumstances inspired the copying. So instead of the you that’s here now and a you spun up in a decade on a computer, there’d be the you that’s here now plus a new secondary brain on that computer, both of which communicate, share some sensory data, and operate almost as if you’d just added more hemispheres to your brain.

        At some point after that, maybe you could start considering individual copies ablative the same way bits of brain are: things you don’t want to lose, but which you can survive losing and can potentially repair and replace given time, because of how redundant and distributed brain functions are.

    • queermunist she/her@lemmy.ml · 7 months ago

      What I’m trying to say is your full body prosthetic would need to look like you, feel like you, sound like you, and have a legal life like you. Imagine if your name was Unit 69420, you looked and sounded like a Star Wars droid, and were legally considered property instead of a person. I think you would definitely experience a fundamental loss of self and become unmoored from material contexts.

      • KobaCumTribute [she/her]@hexbear.net · 7 months ago

        “If shitty things happen to you, then you will not like that and it will suck” still doesn’t break the continuity of self. Fundamentally, the same exact thing can happen to the current flesh-and-blood you, and it would be horrible and destructive: you can be disfigured through an accident or through someone’s cruelty, you can be locked in a cage and dehumanized on the whim of the state-sanctioned professional violence men and given a farce of a trial by the dysfunctional shitshow that is the legal system, etc., but no one is going to argue that shitty things happening to you ontologically unpersons you in some mystical fashion.

        You can be reduced, you can be transformed, but you continue existing for as long as your vital functions do. Talk of someone becoming someone else, or of dying in truth long before dying in body, is just a poetic attempt at articulating sorrow and loss.

        • queermunist she/her@lemmy.ml · edited · 7 months ago

          I never was arguing that an upload becomes unpersoned by trauma. My point, and the point of the article, is that by focusing merely on the brain we miss the other things that make us who we are.

          The goal of an upload is to transfer the self to a machine, right? Well, parts of your self exist outside of your brain. It’s no different than if an upload were missing parts of the brain: either way, it’s incomplete.

          All that means is that any hypothetical future mind-uploading technology would need to capture elements of the body, social life, and society as well. Otherwise we’re not complete.

          I am not just my brain. I am my brain, my body, my social life, my place in history, etc. I am the dialectical relationship between the personal and the impersonal.