The creation of sexually explicit “deepfake” images is to be made a criminal offence in England and Wales under a new law, the government says.

Under the legislation, anyone making explicit images of an adult without their consent will face a criminal record and an unlimited fine.

It will apply regardless of whether the creator of an image intended to share it, the Ministry of Justice (MoJ) said.

And if the image is then shared more widely, they could face jail.

A deepfake is an image or video that has been digitally altered with the help of Artificial Intelligence (AI) to replace the face of one person with the face of another.

  • HexesofVexes

    This is a tricky one - it’s a law that’s setting out to protect folks, but it’s also laying some dangerous groundwork.

    • BreakDecks

      Can you explain more on that? This sounds perfectly reasonable to me.

      • HexesofVexes

        So, the basis of this law is that a person’s facial/physical parameters were used to make vulgar videos/images. This caused them distress, and so it is illegal. At this point, it is aiming to protect people from something deemed traumatic, and the law specifically requires the intent to be “to distress”. It’s a good law.

        Let’s say someone hand-draws a vulgar image using my physical parameters that I find distressing. Is that illegal? At the moment, no; however, it’s not too great a step to see it pushed through by a similar argument (what if the artist was REALLY good, and they intended to distress me?). And from here we go…

        What about a cartoon or caricature? Could someone draw an image of the UK PM performing oral sex on billionaires and fall foul of the law if the subjects find it distressing? Surely such a cartoon cannot help but intend to dismay or distress its subjects?

        Does it have to be vulgar, or just cause distress? Could they just mock a person using an image? Mockery certainly amounts to distress?

        Does it have to be an image, or are distressing written pieces also in scope?

        It shifts towards the idea that “artistic” creations that cause distress to an individual ought to be illegal. This is a viewpoint I stand strongly against.

        However, this law is groundwork. Groundwork can also push towards a lot of good. Then again, how much do you trust UK politicians to make informed internet laws?

        • BreakDecks

          You make good points. I think applying the law differently based on whether the victim was a private person or a public figure (much in the way libel/slander laws do - at least in the USA) would be a good measure to ensure that free speech is preserved, while innocent people are protected.

          Might not be so easy in the UK; I can’t say I know a lot about the application of the law in this regard. Though given the Horizon scandal, I definitely lean towards not trusting UK politicians to make informed laws regulating computers…

          • HexesofVexes

            That may be a good compromise, though it does raise the question of how public a figure one must be before one waives the right not to have malice directed at them.

            I think my main qualm is that well-meaning laws can often set a precedent for unforeseen issues down the road, and the curtailment of any liberty, no matter how vile that liberty is, must be done with care to avoid creating traps for the future. Something UK politicians are famous for failing to do (our digital safety laws are very much responsible for our lack of privacy, and have created some of the most dangerous data troves on the planet!).

      • It just feels like the wrong approach to me.

        Who is to say an image depicts a particular person? Sure, a court or jury can try to decide, but I suspect it’s possible to make deepfakes look enough like the victim to do the damage, while being dissimilar enough that there’s reasonable doubt the depiction was intended to represent the victim.

        I feel like society needs to adapt to this. To me, in the age of deepfakes, there shouldn’t be any shame in being depicted in such a video; the shame should fall on the people sharing it and watching it.

        Like if someone said they saw a naughty video of me, I don’t really see why I should be ashamed, but I would ask them what the hell they’re doing watching it.

        • @Woozythebear@lemmy.world

          It’s not about shame… like wtf? People’s careers can be affected. In a day and age where corporations request access to all your social media and do extensive background/social media checks, it’s likely they may stumble upon a deepfake porn video of you and refuse to hire you over it.

          Teachers are being fired for having OnlyFans accounts, so what do you think is gonna happen when deepfake porn videos of teachers start to pop up and parents start demanding the teacher be fired?

          • Ok, semantics. “Shame” may not be the best word for it but it’s the best I can think of.

            Not as in “I feel ashamed about this video” but rather people thinking you ought to feel ashamed.

            In a day and age where […]

            In a soon-to-be day and age where you can tell your VR goggles “show me a video of that person at the library today”, we shouldn’t be judging that person at the library.