• KoboldCoterie@pawb.social · 7 months ago

    Documentaries often include recreations of events, such as historical events that weren’t filmed. It’s usually noted as being a recreation or re-enactment. If AI-created images are used instead and are noted as being such, I don’t really see the problem, assuming the images are curated to depict the scene accurately.

    • DdCno1@kbin.social · 7 months ago

      The problem in both cases is that people remember these artistic depictions as real, even if there’s a disclosure.

      • db2@lemmy.world · 7 months ago

        Are we worrying about the fully functional adults who still need to be told not to drink Drano?

      • Blóðbók@slrpnk.net · 7 months ago

        That argument extends to any realistic recreation of events. It’s not wrong, I’m just not sure what could be done about it.

    • ringwraithfish@startrek.website · 7 months ago

      This is how I’m leaning too. If done appropriately, this should be no different from the “this is a reenactment of events” disclaimers seen in ’90s and ’00s true crime shows.

      The big challenge is getting the content creators to respect that template and not bury the disclosure in the credits.

    • just_another_person@lemmy.world · 7 months ago

      A recreation is scripted, and I believe it’s legally required to be noted as such. Whether that disclosure goes in the credits or on screen during playback is, I think, at the discretion of the filmmaker and editors.

      That’s a wildly different concept from generative AI models doing whatever they feel like. At the end of the day, I can see why some people can’t see the difference, but it’s huge. I’d also say that if the former were improperly used in a horrific way, you’d just say, “Well, viewers can stay away from that documentary,” but as we’ve all seen over the past decade or so, once a falsely represented account of events is out there, you can’t stop it from spreading, whether it’s a still image or a reenactment. One has current legal repercussions and is covered by libel and slander protections, and the other doesn’t. World of difference.

      • NegativeInf@lemmy.world · 7 months ago

        I… I don’t think they are generating the history on the fly for each individual playback. Probably just generating images based on the concept, iteratively tweaking until it conveys the message that is desired by the artist. You know. Like most artistic works. AI is another tool.

        That’s not to say training data being copped from hardworking artists is good, but an ethically trained image-generation AI isn’t necessarily evil in this context if it’s used to execute the artist’s vision in the way they deem necessary and sufficient. Relying on outside people can often cloud a project’s vision.

        That being said: pay artists for their work, license the material if you want to train on it, and pay credit/royalties until copyright expires or the rights are purchased outright for competitive compensation.

        • just_another_person@lemmy.world · 7 months ago

          The point is more that false “recreations” are protected when you have a planned and scripted setup to film and display it. Generative AI is not included in those laws yet, which is why everyone is trying to get their bullshit in while they can.