THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, or sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress' upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive the deepfake pornography, if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • JovialMicrobial@lemm.ee

    You can tell when a painting is a painting, even if it's photorealistic. You know a person created it and that it's fiction. Often these are hung in galleries where people expect to see art.

    Deep fakes exist to fool people into thinking someone did something (like pornography) when they didn't, usually with the intention of causing harm to their reputation. That's already illegal due to defamation laws, so really it's just an extension of those combined with revenge porn laws.

    The reason they have to include the type of tech in the law is that the tech made it possible for unskilled bad actors to get in on it, so there'll be more people committing these types of crimes against others. It's a good thing they're addressing this issue.

    • Contravariant@lemmy.world

      The reason they have to include the type of tech in the law is that the tech made it possible for unskilled bad actors to get in on it

      Yeah, and that’s the part I don’t like. If you can’t define why it’s bad without taking into account the skill level of the criminal, then I’m not convinced it’s bad.

      As you point out, defamation is already illegal, and deliberately spreading false information about someone with the intent to harm their reputation is obviously wrong and way easier to define.

      And is that not why you consider a painting less ‘bad’? Because it couldn’t be misconstrued as evidence? Note that the act explicitly says a digital forgery should be considered a forgery even when it’s made abundantly clear that it’s not authentic.

      • JovialMicrobial@lemm.ee

        Look, I’m a professional artist. The general rule is you have to change something 15% to 30% (depending on location) for it not to violate copyright law. That’s why you see satirical depictions of brands in cartoons and such.

        This new law has to take into consideration art laws, defamation laws, revenge porn laws, slander laws, and a person’s right to their own likeness.

        It is absolutely necessary to rein this in before serious harm is done to someone. The point of writing a law for this specific issue is that, for the law to actually be effective, it must address the specific problem this technology presents. I listed the other laws to show it’s consistent with ones we already have. There’s nothing wrong with adding another to protect people.

        As for the unskilled part, the point is that a skilled person creating deepfake porn by hand, frame by frame, should get in as much trouble as an unskilled person using AI. The AI is just going to make it so more unethical people are making this crap, so more of it will exist. That’s a problem that needs addressing.

        You have a nice day now.