• bleistift2@feddit.de
    1 year ago

    Isn’t it a good thing for pedophiles to have an outlet for their desires that doesn’t involve harming children? Am I not seeing an obvious downside?

    • PorkRollWobbly@lemmy.ml
      1 year ago

      Pedophilia is not a sexuality and CSAM, AI generated or not, is not a healthy outlet. Pedophilia should be treated as a disease, and pedophiles should receive treatment for that instead.

      • idunnololz@lemmy.world
        1 year ago

        AFAIK you can’t “cure” pedophilia, just as you can’t cure homosexuality. The best you can do is teach people not to act on their desires.

      • bleistift2@feddit.de
        1 year ago

        pedophiles should receive treatment for that instead

        In a world where many people cannot afford basic healthcare or – if they can afford it – where healthcare isn’t available in the required quantity, does your argument still hold?

    • PotatoKat@lemmy.world
      1 year ago

      If I’m not mistaken, I remember reading that consuming CSAM increases the likelihood of offending, since it normalizes the act and makes the fantasies more vivid. It makes them more likely to act out what they see rather than removing the desire.

      • LinkOpensChest.wav
        1 year ago

        This would explain why the people with pedophilic tendencies in the facility where my brother worked were recommended not to even view certain non-sexual media involving children. The doctors explained that doing so would only reinforce their pedophilia.

        In spite of how many people I see online repeating the myth that “providing an outlet” decreases the chance of offending, the experts in the field appear to disagree, and I’ve not seen any evidence that providing access to simulated CSAM has any benefits whatsoever. It seems most likely that it’s detrimental, no matter how many professional redditeurs tell us they’re “just pixels on the screen.”

        Edit: After reading further in this thread, I feel the need to mention that unhinged violence against non-offending pedophiles also does not solve anything. Some of you need help.

    • klingelstreich@feddit.de
      1 year ago

      It depends on whether you hold a world view where every person is valuable and needs help and understanding to become their best self or one where there are good and bad people and the baddies need to be punished and locked away so everyone else can live their life in peace.

            • Norgur@kbin.social
              1 year ago

              That’s a rather useless contribution to the discussion. The initial argument was a line of reasoning for why artificial CSAM might be a benefit, letting people vent their otherwise harmful urges without harming actual people. You just flat-out responded that it is enabling and doesn’t stop distribution, which amounts to “no, u wrong.” Care to share the reasons behind your stance?

            • Kusimulkku@lemm.ee
              1 year ago

              I’m not saying it’s a better alternative; I’m saying it might not make sense to talk about it “involving minors”.

                • Norgur@kbin.social
                  1 year ago

                  That’s not being picky about wording.
                  While I agree that stuff like that should not exist at all, there is a vast difference between it existing because someone abused a child, recorded it, and scarred that child for life, and it existing because someone made a computer arrange pixels in a disgusting way.

  • OsrsNeedsF2P@lemmy.ml
    1 year ago

    On one hand, yes, but on the other, Stable Horde developed a model to detect CSAM thanks to Stable Diffusion, and that’s being used to combat pedos globally.

  • neuropean@kbin.social
    1 year ago

    What’s interesting is that mammals from mice to dogs don’t draw a distinction between arbitrary ages before trying to copulate. On the other hand, they don’t try to fuck the equivalent of pre-pubescent members of their species either, so there’s nothing natural about that.