• Mac@mander.xyz · 1 day ago

    If only anyone could have predicted this!
    Musk keeps making money, though, so it will keep happening.

  • [deleted]@piefed.world · 2 days ago

    This is absolutely horrible, and I wish the media would stop wording things as if genAI has x-ray specs that can reveal what is underneath clothing.

    Not “someone used AI to remove her clothing in the original photo.”

    Someone used AI to generate a hypothetical nude based on the original photo. They made fake nudes of her for the purpose of harassment, yes, but they did not magically create actual nudes of her. The punishment should be the same as for releasing nudes against her will, but terminology like ‘remove clothes’ makes it sound like genAI can magically create real nudes of someone, or implies there are nudes of that person somewhere for the genAI to use for this purpose.

    Part of the reason I think this is an important distinction is that genAI can also put people in situations they have never been in. It is entirely fictional, and it shouldn’t be worded as if it were something that actually happened in real life.

    • Jo Miran@lemmy.ml · 2 days ago

      I genuinely wish I could enthusiastically agree with you. I won’t go into detail, but a friend with whom I have spent a lot of time naked and whose body I know very well had this done to her by one of her students (middle schoolers are the worst). I guess if you feed the AI enough pictures, from enough angles, wearing different outfits, it can vomit up a fairly good guess. Excluding birthmarks on the abdomen, a small tattoo on the pubic region, and the nipples/areolas, it was REALLY fucking close.

      The reason this matters is that the level of detail and accuracy differentiates this assault from things like photoshopping someone’s head onto a random nude. This is much, much worse, and I think I support treating it as illegally removing someone’s clothes.

      Some creep could copy and paste your head onto some random nude photo and jack off to that. That’s bad, but imagine that same creep (who you don’t know is a creep) takes photos of you – has you pose for photos – then takes them and “nudifies” them. They can pleasure themselves with photos of you that they themselves took. That is absolutely horrifying in my opinion.

      PS: My apologies if the above is triggering to anyone here, but I think it is important to not undersell this type of sexual assault.

    • KairuByte@lemmy.dbzer0.com · 1 day ago

      I feel the need to point out that enough shots from enough angles, in anything other than multiple layers or sweats, are going to essentially result in an “x-ray” effect. Yeah, it won’t know the exact hue of your nipples, or the precise angle of your dangle, but it’s going to be close enough to end a career.

        • KairuByte@lemmy.dbzer0.com · 20 hours ago

          I’m not sure I get what your comment is referencing. If you feed enough data into an “ai” meant to generate lifelike images, you’re going to get a relatively close approximation of the person. Will it be exact? Absolutely not. But again, it will be enough to put your job in danger in many situations.

          • The Octonaut@mander.xyz · 20 hours ago

            The people - very, very many of them literal school children - doing this are not training image AI models, or even LoRAs or whatever, on their home servers by feeding them images of a person from multiple angles with different parts exposed. They’re just taking a single image and uploading it to some dodgy Android store app or, y’know, Grok. Which then colours in the part it identifies as clothes with a perfectly average image from the Internet (read: heavily modified in the first place and skewed towards unrealistic perfection). The process is called in-painting.

            The same models use the same technique if you just want to change the clothes, and people find that a brief amusement. If you want to replace your bro’s soccer jersey with a jersey of a team he hates to wind him up, you are not carefully training the AI to understand what he’d look like in that jersey. You just ask the in-painter to do it and, assuming it has already been fed what the statistical average combination of pixels for “nude girl” or “Rangers jersey” is, it applies a random seed and starts drawing, immediately and quickly.
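            (For a sense of how little machinery that involves: here is a minimal sketch of that in-painting flow for the benign jersey-swap case, using the open-source diffusers library. The model id, prompt, and file names are illustrative placeholders, not anything from this thread.)

            ```python
            # Minimal in-painting sketch. Model id and file names are assumptions
            # for illustration; any public inpainting checkpoint works the same way.
            import torch
            from PIL import Image
            from diffusers import StableDiffusionInpaintPipeline

            # Load a public in-painting checkpoint; no per-person training happens here.
            pipe = StableDiffusionInpaintPipeline.from_pretrained(
                "stabilityai/stable-diffusion-2-inpainting",
                torch_dtype=torch.float16,
            ).to("cuda")

            # One source photo plus a mask: white pixels mark the region to redraw
            # (the jersey); black pixels are left untouched.
            image = Image.open("photo.png").convert("RGB").resize((512, 512))
            mask = Image.open("jersey_mask.png").convert("RGB").resize((512, 512))

            # The model fills the masked region with a statistically plausible match
            # for the prompt, starting from a random seed: seconds, not training runs.
            result = pipe(prompt="a blue soccer jersey", image=image, mask_image=mask).images[0]
            result.save("edited.png")
            ```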

            That’s the problem. It has always been possible to make a convincing fake nude of someone. But there was a barrier to entry - Photoshop skills, or paying someone with Photoshop skills, time, footprint (you’re not going to be doing this on dad’s PC).

            Today that barrier to entry is massively reduced, which has put this means of abuse in the hands of every preteen with a smartphone, in a matter of seconds. And then it gets shared with the whole peer group, also in a matter of seconds.

            It’s the exact same logic by which I occasionally find a use for image generation tools. Yes, I can probably draw an orc with a caltrop stuck up his nose, but I can’t do that mid-session of D&D, and if it’s for a 10-second bit, why bother? Being able to create and share it within seconds is a large part of the selling point of these tools. Did I just steal from an artist? Maybe. Was I going to hire an artist to do it for me? No. Was I going to Google the words “orc” and “caltrop” and overlay the results for a cheap laugh? Maybe. Is that less stealing? Maybe. Am I getting way off the point that these people aren’t training image generation AIs with fragments of photos in order to make a convincing fake? Yes.

            • KairuByte@lemmy.dbzer0.com · 19 hours ago

              I… I don’t think it’s children creating deepfake nudes of people and posting them to OnlyFans.

              I do get your point that most of the time people aren’t training a model on a plethora of images, but it isn’t that difficult a thing to do. It’s more complicated than asking Grok to “take this single shot and make them naked,” but it’s something you can figure out in under a day of research if your mind is set to it.