‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity: It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • @TORFdot0@lemmy.world
    19 · 11 months ago

    Not all nudity is sexual, but there is no non-sexual reason to use AI to undress someone without consent.

    • @Eezyville@sh.itjust.works
      8 · 11 months ago

      The question of consent is something I’m trying to figure out. Do you need consent to alter an image that is available in a public space? What if it was you who took the picture of someone in public?

      • @TORFdot0@lemmy.world
        8 · 11 months ago

        Keep in mind there is a difference between ethical and legal standards. Legally, you may not need consent to alter a photo of someone, unless possibly it was a copyrighted work. But ethically it definitely requires consent, especially in this context.

        • @Eezyville@sh.itjust.works
          3 · 11 months ago

          The difference between legal and ethical is that one could get you fined or imprisoned, and the other would just make a group of people not like you.