‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

    • @TORFdot0@lemmy.world
      19 · 11 months ago

      Not all nudity is sexual, but there is no non-sexual reason to use AI to undress someone without consent.

      • @Eezyville@sh.itjust.works
        8 · 11 months ago

        The question of consent is something I’m trying to figure out. Do you need consent to alter an image that is available in a public space? What if it was you who took the picture of someone in public?

        • @TORFdot0@lemmy.world
          8 · 11 months ago

          Keep in mind there is a difference between ethical and legal standards. Legally you may not need consent to alter a photo of someone, except possibly if it’s a copyrighted work. But ethically it definitely requires consent, especially in this context.

          • @Eezyville@sh.itjust.works
            3 · 11 months ago

            The difference between legal and ethical is that one could get you fined or imprisoned, and the other would just make a group of people not like you.

    • @Pyr_Pressure@lemmy.ca
      2 · edited · 11 months ago

      Just because something shouldn’t be doesn’t mean it won’t be. This is reality, and we can’t just wish something to be true. Saying it doesn’t really help anything.

      • @lolcatnip@reddthat.com
        4 · edited · 11 months ago

        Whoooooosh.

        In societies that have a healthy relationship with the human body, nudity is not considered sexual. I’m not just making up fantasy scenarios.