• @bleistift2@feddit.de · 30 points · 8 months ago

    Isn’t it a good thing for pedophiles to have an outlet for their desires that doesn’t involve harming children? Or am I missing an obvious downside?

    • PorkRollWobbly · 36 points · 8 months ago

      Pedophilia is not a sexuality, and CSAM, AI-generated or not, is not a healthy outlet. Pedophilia should be treated as a disease, and pedophiles should receive treatment for it instead.

      • idunnololz · 8 points · 8 months ago

        AFAIK you can’t “cure” pedophilia, just as you can’t cure homosexuality. The best you can do is teach people not to act on their desires.

        • at_an_angle · 1 point · 8 months ago

          There is a cure for pedophilia. It’s a .45 ACP hollow point to the genitals, the brain, or both.

      • @bleistift2@feddit.de · 3 points · 8 months ago

        “pedophiles should receive treatment for that instead”

        In a world where many people cannot afford basic healthcare, or where healthcare isn’t available in the required quantity even for those who can afford it, does your argument still hold?

    • @PotatoKat@lemmy.world · 13 points · 8 months ago

      If I’m not mistaken, I remember reading that consuming CSAM increases the likelihood of offending, since it normalizes the act and makes the fantasies more vivid. It makes consumers want to act out what they see rather than reducing their desires.

      • LinkOpensChest.wav · 1 point · 8 months ago

        This would explain why the people with pedophilic tendencies in the facility where my brother worked were recommended not to even view certain non-sexual media involving children. The doctors explained that doing so would only reinforce their pedophilia.

        In spite of how many people I see online repeating the myth that “providing an outlet” decreases the chance of offending, the experts in the field appear to disagree, and I’ve not seen any evidence that providing access to simulated CSAM has any benefits whatsoever. It seems most likely that it’s detrimental, no matter how many professional redditeurs tell us they’re “just pixels on the screen.”

        Edit: After reading further in this thread, I feel the need to mention that unhinged violence against non-offending pedophiles also does not solve anything. Some of you need help.

    • @klingelstreich@feddit.de · 8 points · 8 months ago

      It depends on whether you hold a worldview where every person is valuable and needs help and understanding to become their best self, or one where there are good and bad people, and the baddies need to be punished and locked away so everyone else can live their lives in peace.

            • Norgur · 2 points · 8 months ago

              That’s a rather useless contribution to the discussion. The initial argument was a line of reasoning for why artificial CSAM might be a benefit: it could let people vent their otherwise harmful urges without harming actual people. You just flatly responded “it is enabling and doesn’t stop distribution”, which amounts to “no, u wrong”. Care to tell us your reasons behind your stance?

                • @bleistift2@feddit.de · 1 point · 8 months ago

                  “it is enabling and doesn’t stop distribution”

                  Norgur’s point is that you didn’t provide any reasoning why that should be the case.

            • @Kusimulkku@lemm.ee · 1 point · 8 months ago

              I’m not saying it’s a better alternative, I’m saying it might not make sense to talk about it “involving minors”.

                • Norgur · 1 point · 8 months ago

                  That’s not being picky about wording.
                  While I agree that stuff like this should not exist at all, in any way whatsoever, there is a vast difference between it existing because someone abused a child, recorded it, and scarred that child for life, and it existing because someone made a computer make up pixels in a way that is disgusting.

      • Deceptichum · 10 points · 8 months ago

        No, not at all.

        That’s why people like them: you can say “make me a photo of a monkey riding a pickle in space” or “a dog made of cheese” and it’ll make it, despite obviously having no direct reference.

        It only needs to be trained to know what things are; it can mix them freely.
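
        For a concrete picture, here’s a minimal sketch of that concept-mixing with the open-source diffusers library. The specifics are assumptions for illustration: the checkpoint is just a commonly used public one, and the prompts are the examples from the comment above.

        ```python
        # Minimal sketch: a text-to-image model composing concepts it has
        # never seen together. Assumes the `diffusers` and `torch` packages
        # and a CUDA GPU; the checkpoint is a common public example.
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",
            torch_dtype=torch.float16,
        ).to("cuda")

        # The training set almost certainly contains no photo of this scene;
        # the model assembles it from concepts it learned separately.
        image = pipe("a monkey riding a pickle in space").images[0]
        image.save("monkey_pickle_space.png")
        ```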

  • @OsrsNeedsF2P@lemmy.ml · 10 points · 8 months ago

    On one hand, yes, but on the other, Stable Horde developed a model to detect CSAM thanks to Stable Diffusion, and that’s being used to combat pedos globally.
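
    (Stable Horde’s detector itself isn’t something I can quote here, so as a stand-in, here’s a rough sketch of the same screening pattern using the stock safety checker that ships with the diffusers Stable Diffusion pipeline; the checkpoint and prompt are placeholder examples.)

    ```python
    # Rough sketch of output screening, using the stock safety checker
    # bundled with the diffusers Stable Diffusion pipeline (loaded by
    # default); a stand-in illustration, not Stable Horde's own detector.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16,
    ).to("cuda")

    out = pipe("a dog made of cheese")
    # Flagged images are returned blacked out, with a per-image flag.
    for image, flagged in zip(out.images, out.nsfw_content_detected):
        if flagged:
            print("image was flagged by the safety checker")
        else:
            image.save("dog_made_of_cheese.png")
    ```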

  • neuropean · 5 points · 8 months ago

    What’s interesting is that mammals from mice to dogs don’t draw a distinction between arbitrary ages before trying to copulate. On the other hand, they don’t try to fuck the equivalent of pre-pubescent members of their species either; there’s nothing natural about that.