‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @Crow@lemmy.world

    I remember being a dumb & horny kid and Photoshopping my crush’s face onto a porn photo. And even then I felt what I did was wrong and never did it again.

    • @CleoTheWizard@lemmy.world

      I feel like what you did and the reaction you had to what you did is common. And yet, I don’t feel like it’s harmful unless other people see it. But this conversation is about to leave men’s heads and end up in public discourse where I have no doubt it will create moral or ethical panic.

      A lot of technology challenges around AI are old concerns about things that we’ve had access to for decades. It’s just easier to do this stuff now. I think it’s kind of pointless to stop or prevent this stuff from happening. We should mostly focus on the harms and how to prevent them.

      • @azertyfun@sh.itjust.works

        I’ve seen ads for these apps on porn websites. That ain’t right.

        Any moron can buy a match and a gallon of gasoline, freely and legally, and that’s a good thing. But I would hope that anyone advertising and/or selling Arson Kits™ online would be jailed. Of course this will not stop determined arsonists, but repression might deter morons, inventive psychopaths, and overly impulsive people (especially teenagers!) from actually going through with a criminal act. Not all of them. But some/most of them. And that’s already a huge win.

        • @KairuByte@lemmy.dbzer0.com

          I mean, you’ve been able to do a cursory search and get dozens of “celeb lookalike” porn for many years now. “Scarjo goes bareback” isn’t hard to find, but that ain’t Scarjo in the video. How is this different?

          Edit: To be clear, it’s scummy as all fuck, but still.

          • shuzuko

This is different because, to a certain extent, people in the public eye can expect, anticipate, and react to/suppress this kind of thing. They have managers and PR people who can help them handle it in a way that doesn’t negatively affect them. Billy’s 13-year-old classmate Stacy doesn’t have those resources, and now he can do the same thing to her. It’s on a very different level of harm.

            • @KairuByte@lemmy.dbzer0.com

              Billy doesn’t need a nudify app to imagine Stacy naked. Not to mention, images of a naked 13 year old are illegal regardless.

              • @azertyfun@sh.itjust.works

Why are you pretending that “nudify apps” produce ephemeral pictures equivalent to a mental image? They most definitely do not.

Underage teenagers have already shared fake porn of their classmates. It being illegal doesn’t stop them, and as fun as locking up a thirteen-year-old sounds (assuming they get caught, prosecuted, and convicted), that still leaves another kid traumatized.

                  • @azertyfun@sh.itjust.works

                    Go after the people advertising those apps. Developers and advertisement agencies who say/intentionally imply “create naked pictures of people you know” should all be prosecuted.

Unlike Photoshop or generic Stable Diffusion software, these apps have literally no legitimate reason to exist, since the ONLY thing they facilitate is creating non-consensual pornography. Seems like something that would be very easy to criminalize.

              • @Sweetpeaches69@lemmy.world

Just as the other people in this made-up scenario don’t need an app to imagine Scarlett Johansson naked. It’s a moot point.

                • @CleoTheWizard@lemmy.world

I think most of this is irrelevant, because AI image generation is inherently hard to limit in this way, and I think it will become so prevalent as to be hard to regulate. What I’m saying is: we should prepare for a future where fake nudes of literally anyone can be made and shared easily. It’s already too late. These tools, as was said earlier, already exist and are here. The only thing we can do is severely punish people who post the photos publicly. Sadly, we know how slow laws are to change. So in that light, we need to legislate based on long-term impact instead of short-term reactions.

                • @KairuByte@lemmy.dbzer0.com

And?… There’s a major difference between “a lookalike of a grown adult” and “AI-generated child porn,” as I’m sure you’re aware. At no point did anyone say child porn was going to be legal, until the person I was replying to brought it up as a strawman argument. ¯\_(ツ)_/¯