All 50.

At the same time.

In lockstep.

Of course. lol

  • Burger@burggit.moe · 1 year ago

    …This probably won’t stop here, unfortunately. Once they start trying to regulate this stuff, they’ll keep pushing for more restrictions, especially if you can generate images/video of celebrities/our political elite saying ‘nigger.’ But yeah, think of the fictional children!

    To address child exploitation, we already have laws on the books making it illegal to possess child pornography, so if such material is used in or to generate a model, you’re already in trouble for that. There’s no reason to make more unnecessary laws, which goes right back to what I said earlier in this post: pure trojan horse, plain and simple.

    • Mousepad@burggit.moe · 1 year ago

      Once they start trying to regulate this stuff, they’ll keep pushing for more restrictions, especially if you can generate images/video of celebrities/our political elite saying ‘nigger.’

      Okay, to be fair, that is a real problem. I am, and I imagine most people are, very uncomfortable with indistinguishable-from-reality deepfakes. The implications for the spread of misinformation are only a small part of the problem.

      • Burger@burggit.moe · 1 year ago

        There’s nothing you can do about it, though, short of banning it or restricting generation to people with special licenses for academic/business use. If someone makes their software embed a watermark or some kind of metadata that helps you verify that what you’re watching is fake, someone else will just release their own software/model that doesn’t.

        This is either an all or nothing kind of deal.

        • Mousepad@burggit.moe · 1 year ago

          Yeah, I don’t have a good solution. You could just make it illegal to make deepfakes of real people, but that is pretty restrictive (not to mention impossible to enforce).

    • SquishyPillow@burggit.moe (OP) · 1 year ago

      Tinfoil hat time!

      It’s all to protect the viability of blackmail. Anyone who has been blackmailed via photography, video, or audio will be able to use AI-generated content to plausibly deny the legitimacy of the blackmail material. Those who rely on blackmail for power, wealth, and influence are incentivized to buy as much time as they can. Blackmailed individuals can be leveraged to buy time in various ways, one of which is on display in the headline of this article. Now that many past social taboos have been normalized, what can you blackmail people with in the current year?