Disney’s Loki faces backlash over reported use of generative AI

A Loki season 2 poster has been linked to a stock image on Shutterstock that seemingly breaks the platform’s licensing rules regarding AI-generated content.

A promotional poster for the second season of Loki on Disney Plus has sparked controversy among professional designers following claims that it was created using generative AI.

  • @Iwasondigg
    168 points · 9 months ago

    I don’t understand the controversy, really. A graphic designer at Disney used stock photography in their design of the poster; that’s pretty normal and extremely common. It turns out that whoever uploaded that stock image to the service used AI to create it, but how is that Disney’s fault? I don’t get it.

    • Patapon Enjoyer
      68 points · 9 months ago (edited)

      AI taking the job of someone else by stealing art aside,

      According to Shutterstock’s contributor rules, AI-generated content is not permitted to be licensed on the platform unless it’s created using Shutterstock’s own AI-image generator tool.

      The picture was not flagged as AI, so it was sold as real art against their TOS.

      I don’t think the artists or even the studio did this maliciously, but there needs to be a discussion about how stock art should be vetted when it’s used like this.

      • P03 Locke
        17 points · 9 months ago

        Can we talk about how Shutterstock only allows their own AI-generated images? Stock image sites will be the first to face the guillotine of AI generation, and this is how they protect themselves?

        Good riddance. I’ve got my video card and several Stable Diffusion models, which are a far better deal than the prices they charge.
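        For anyone curious, a minimal sketch of what that local setup looks like with Hugging Face’s diffusers library (the checkpoint name and prompt here are placeholders, not anything to do with the poster):

        ```python
        # Minimal local text-to-image sketch using the diffusers library.
        # Assumes a CUDA GPU and a Stable Diffusion 1.5 checkpoint; swap in
        # whichever model you actually have downloaded.
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
        ).to("cuda")

        # Illustrative prompt only.
        image = pipe("ornate spiral clock face, baroque engraving, high detail").images[0]
        image.save("clock.png")
        ```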

        • @ante@lemmy.world
          15 points · 9 months ago

          You’re not a business whose sole purpose is to sell/license images. If you read the article, it explains that their models are trained using only images from their library, which seems like a sensible approach to avoiding copyright issues.

          • P03 Locke
            7 points · 9 months ago

            There are no copyright issues to avoid. Stable Diffusion doesn’t suddenly become illegal based on the images it was trained on. It is a 4GB file of weights and numbers, not a many-petabyte database of images.
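            (A rough way to see that for yourself, assuming the standard Stable Diffusion 1.5 checkpoint from Hugging Face as a stand-in: load the pipeline and total up the size of its weight tensors.)

            ```python
            # Sketch: tally the in-memory size of a Stable Diffusion checkpoint's
            # weights to show it's a few GB of numbers, not an archive of images.
            from diffusers import StableDiffusionPipeline

            pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

            total_bytes = sum(
                p.numel() * p.element_size()
                for module in (pipe.unet, pipe.vae, pipe.text_encoder)
                for p in module.parameters()
            )
            print(f"~{total_bytes / 1e9:.1f} GB of weights across the UNet, VAE, and text encoder")
            ```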

            Furthermore, Shutterstock cannot copyright their own AI-generated images, no matter how much they want to charge for them. That’s already been decided in the courts. So even if the model is trained on their own images, if an image was fully generated with their AI, anybody is free to yank it from their site and use it anywhere they want.

            This is a dying industry trying desperately to hold on to its profit model.

            • @TwilightVulpine@lemmy.world
              7 points · 9 months ago

              Here we get to the crucial distinction between “legal” and “moral”.

              It is not currently illegal to build a “database of weights and numbers” by crawling art and images without permission, attribution or compensation, for the express purpose of creating similar works to replace the work of the artists whose artworks were used to train it, and which they rely on to make a living.

              That doesn’t mean that it shouldn’t be legislated.

              Really not a fan of this “dying industry” talk in light of this.

              • @Even_Adder@lemmy.dbzer0.com
                3 points · 9 months ago (edited)

                It is morally right to be able to use others’ copyrighted material without permission for analysis, criticism, research, satire, parody, and artistic expression like literature, art, and music. In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. It would be awful for everyone if IP holders could take down any review, reverse engineering, or index they didn’t like. That would be the dream of every corporation, bully, troll, or wannabe autocrat. It really shouldn’t be legislated.

                AI training isn’t only for mega-corporations. After we’ve gone through and gutted all of our rights and protections like too many people want to do, we’ll have handed corporations a monopoly on a public technology by making it prohibitively expensive for us to keep developing our own models. Mega-corporations will still have all their datasets, and the money to buy more. They might just make users sign predatory ToS too, allowing them exclusive access to user data, effectively selling our own data back to us. People who could have had access to a corporate-independent tool for creativity, education, entertainment, and social mobility would instead be worse off, with fewer resources and rights than they started with.

                I recommend reading this article by Kit Walsh, a senior staff attorney at the EFF, if you haven’t already. The EFF is a digital rights group that most recently won a historic case: border guards now need a warrant to search your phone.

                You should also read this open letter by artists who have been using generative AI for years, some for decades. I’d like to hear your thoughts.

                • @TwilightVulpine@lemmy.world
                  3 points · 9 months ago (edited)

                  I have read that article and I have found it sorely insufficient at addressing the concerns of the artists who are having to deal with this new situation. The EFF is usually great but I cannot agree with them on this stance.

                  You speak of “IP holders” and “corporations”, seemingly to give a connotation of overbearing nameless organizations to any attempt at legislation, but you don’t have a single word to say about the independent artists who are being driven out of their artistic careers by this. It doesn’t sound like you even considered what their side is like, just that you decided that it’s “morally right” to have free access to everyone’s works for AI training.

                  How fair is the “Fair Use” that lets artists get replaced by AIs trained on their works? Way too often, AI proponents argue about current legal definitions as if this were merely a matter of philosophical mind games rather than people’s lives. The law exists to ensure people’s rights and well-being. It’s not sufficient for something to fit the letter of the law if we want to judge it as just.

                  I did read this open letter, although I already wasn’t expecting much, and I can only find it sappy, shallow and disingenuous. They may say that they don’t care about using AI to replicate others’ works, but not only is that not sufficient to prevent it, it also doesn’t address all the artists’ works that were still used without permission, attribution or compensation, even if the resulting AI is used to produce works that don’t resemble any other work in particular.

                  > We see a unique opportunity in this moment to shape generative AI’s development responsibly. The broad concerns around human artistic labor being voiced today cannot be ignored. All too often, major corporations and other powerful entities use technology in ways that exploit artists’ labor and undermine our ability to make a living.

                  But this has already failed. AI has already been developed and released irresponsibly. Corporations are already using it to exploit artists’ labor. Many major models are themselves an exploitation of artists’ labor. These are hollow words that don’t even suggest a way to address the matter.

                  There is only one thing I want to hear from AI advocates if they intend to justify it. Not legal wording or technical details or philosophical discussions about the nature of creativity, because ultimately they don’t address the material issues. Rather: how do they propose that the artists whose works they relied on ought to be supported? Because to scrape all their stuff and then turn around and say they are fated to be replaced, like many AI proponents do, is horribly callous, ungrateful and potentially more damaging to culture than any licensing requirement would be.

                  • @Even_Adder@lemmy.dbzer0.com
                    3 points · 9 months ago

                    > You speak of “IP holders” and “corporations”, seemingly to give a connotation of overbearing nameless organizations to any attempt at legislation, but you don’t have a single word to say about the independent artists who are being driven out of their artistic careers by this. It doesn’t sound like you even considered what their side is like, just that you decided that it’s “morally right” to have free access to everyone’s works for AI training.

                    It is morally right to be able to use copyrighted material for whatever allows people to express themselves and enables the fair, free flow of information. Artists are holders of IP in this case, but they are not corporations. Many seemingly want to go down the same path as abusive organizations like the RIAA. They seek to become abusers themselves and hobble people to keep them from participating in certain conversations. That isn’t right.

                    > How fair is the “Fair Use” that lets artists get replaced by AIs trained on their works? Way too often, AI proponents argue about current legal definitions as if this were merely a matter of philosophical mind games rather than people’s lives. The law exists to ensure people’s rights and well-being. It’s not sufficient for something to fit the letter of the law if we want to judge it as just.

                    AI aren’t people; they are a tool for people to use, and all people have a right to self-expression, which includes the training of AI. What some people want would give too much power over discourse to a few who have a financial and social incentive to be as controlling as possible. That kind of balance would be ripe for abuse and would be catastrophic for everyone else. Like print media vs. internet publication and TV/radio vs. online video, there will be winners and losers, but I think this will all be in service of a more inclusive, decentralized, and open media landscape.

                    > I did read this open letter, although I already wasn’t expecting much, and I can only find it sappy, shallow and disingenuous. They may say that they don’t care about using AI to replicate others’ works, but not only is that not sufficient to prevent it, it also doesn’t address all the artists’ works that were still used without permission, attribution or compensation, even if the resulting AI is used to produce works that don’t resemble any other work in particular.

                    You simply don’t have to compensate someone to analyze public data. That would be like handing someone a flyer for lessons and then trying to collect a fee because they got good at the same kind of thing you do. They put in all the work, and they make new stuff that’s all their own.

                    > We see a unique opportunity in this moment to shape generative AI’s development responsibly. The broad concerns around human artistic labor being voiced today cannot be ignored. All too often, major corporations and other powerful entities use technology in ways that exploit artists’ labor and undermine our ability to make a living.

                    > But this has already failed. AI has already been developed and released irresponsibly. Corporations are already using it to exploit artists’ labor. Many major models are themselves an exploitation of artists’ labor. These are hollow words that don’t even suggest a way to address the matter.

                    > There is only one thing I want to hear from AI advocates if they intend to justify it. Not legal wording or technical details or philosophical discussions about the nature of creativity, because ultimately they don’t address the material issues. Rather: how do they propose that the artists whose works they relied on ought to be supported? Because to scrape all their stuff and then turn around and say they are fated to be replaced, like many AI proponents do, is horribly callous, ungrateful and potentially more damaging to culture than any licensing requirement would be.

                    If I can’t use legal wording, technical details, or philosophy, how am I supposed to explain? Your goal seems to be only to avoid or dismiss the complexity, nuance, or validity of any explanation. The best I can do is this: it isn’t exploitation to analyze, reverse engineer, critique, or parody. It took us 100,000 years to get from cave drawings to Leonardo da Vinci. This is just another step, like the camera obscura. We’re all standing on the shoulders of giants. We learn from each other, and humanity is at its best when we can all share in our advancements. Calling this exploitation is self-serving, manipulative rhetoric that unjustly vilifies people and misrepresents the reality of how these models work. And I never said anyone was fated to be replaced; you’re putting words in my mouth.

                    Generative AI is free and open source. There is a vibrant community of researchers, developers, activists, and artists working on FOSS software and models for anyone to use. There’s a worldwide network working for the public, oftentimes leading research and development, for free. We’d like nothing more than to have more people join us, because together we are stronger.

                    I understand you’re passionate about this topic, and I respect your feelings. But I think you use manipulative language, personal attacks, and misrepresentation of arguments to get out of giving any explanation. You have not provided any support or reasoning, and you ignored the points and facts I presented. The way you talk to people isn’t fair, and I don’t really feel like continuing this discussion, but thanks for listening.

            • @ante@lemmy.world
              1 point · 9 months ago (edited)

              I don’t get what your point is. Are you trying to generate images with Stable Diffusion and upload them to Shutterstock? Because that’s the only situation when the thing you’re complaining about applies. Nobody is stopping you from generating images and using them. What they are doing is preventing you from generating them and then trying to profit from them on the Shutterstock platform, unless you use their tools. Why is this an issue, in your opinion?

            • Patapon Enjoyer
              1 point · 9 months ago (edited)

              If that’s correct, then it’s even more understandable why they wouldn’t want an avalanche of pictures anyone can use for free on a service whose business is selling pictures.

      • @Touching_Grass@lemmy.world
        8 points · 9 months ago (edited)

        More reason for Disney to just use AI-generated art. I don’t see the point of artists anymore other than being in the way of creating things. It seems like all they do now is sue everyone and help create tools to limit everyone else.

          • @Touching_Grass@lemmy.world
            6 points · 9 months ago (edited)

            Sure it does. People tell me all the time.

            Let me explain, though. People create stuff. Artists create overpriced versions of the same stuff, but also sue you if you think about sharing it with anybody or creating your own. And the whole time they demand your attention by invading any cool space to busk. Like tipping culture, it invades everywhere.

            Eventually, spaces that were collaborative and imaginative and unique are sued into oblivion and threatened with DMCA takedowns so that this mediocre and costly mass-produced stuff can be sold for 20x its value.

            If artists disappeared tomorrow, we would see a boom of content creation like never before. If we removed all the people trying to make their dollar in our spaces, we would be left with actual creators, not artists. We could chase the corporate social media hacks away. We could get back to a free internet when we remove all the people trying to capitalize on it.

            These greedy bottom feeders only get worse the more popular they get.

            Think of Justin Bieber + Psychosocial. Really fun. Justin Bieber or Slipknot alone, not as fun. Try to find that mix on Spotify. You never will. And in a world of Spotify’s monopoly on online music, we all lost the unique creative opportunity the internet provided, because we all need to overpay for the artist nobody asked for. Anybody remember downloading crazy remixes on BearShare? How fun was that, playing an audio file of 4 songs mashed up and getting a truly awesome new song? Never again will we get that unique window of creation in our time.

            The internet was a refuge for people to get away from the overproduced corporate crap, and instead the artists brought it all here and censored and sued and threatened and put up paywall after paywall, all to funnel us to their shitty fucking ad-supported websites and Patreon.

            • Maven (famous)
              9 points · 9 months ago

              You have to be trolling.

              There’s absolutely no way anything you just said here is remotely serious.

              Funny read tho, thanks for the chuckle.

        • Chaotic Entropy
          12 points · 9 months ago

          > I don’t see the point of artists anymore other than being in the way of creating things.

          … okay.

    • @BB69@lemmy.world
      23 points · 9 months ago

      Because the corporation is ALWAYS at fault, duh. This is the internet, there’s only one way to look at things

        • @Nevoic@lemm.ee
          3 points · 9 months ago (edited)

          No way could this clusterfuck of IP (owning thoughts), the worry of AI “taking jobs” (e.g. doing work that would otherwise be done by humans), and the selling of the work on a marketplace be at all tied to the idea of capitalism.

          In other economic systems, having work automated would be a good thing, not an existential threat to the functioning of our entire global economy. I’m blown away that people don’t understand that.

    • P03 Locke
      3 points · 9 months ago

      Why would they use a stock image of Loki? That already seems like its own copyright issue. Any image or likeness of a Disney character isn’t exactly “stock”.

      • @ante@lemmy.world
        13 points · 9 months ago

        Read the fucking article, man. It’s not a stock image of a character, it’s the spiral clock background.

        • P03 Locke
          3 points · 9 months ago

          I mean, besides the Roman numeral mistake and Shutterstock’s licensing rules, which is just a side conversation, what’s the backlash?

          Are we supposed to be immediately outraged when some artist uses some level of AI-generation when trying to create something? Is everybody going to be outraged when somebody uses Photoshop Generative Fill, or is that suddenly okay because it’s part of a commercial tool?

    • Shazbot
      2 points · 9 months ago

      There’s one that comes to mind: registration of works with the Copyright Office. When submitting a body of work, you need to ensure that you’ve got everything in order. This includes rights for models/actors, locations, and other media you pull from. Having AI mixed in may invalidate the whole submission. It’s cheaper to submit related works in bulk, so a fair amount of Loki material could be in limbo until the application is amended or resubmitted.

      • @Honytawk@lemmy.zip
        2 points · 9 months ago

        AI collides with copyright. The two systems don’t work together at all.

        Because if an image is generated, who “owns” it?

        • The person who wrote the prompt
        • The AI that generated the image
        • The researchers that developed the AI
        • The artists the AI is based upon

        It just doesn’t work. And AI is here to stay. So the only possible solution I see is that we revise the entire copyright system.

        Which is long overdue anyway. Disney has gotten away with too much already.

        • Shazbot
          2 points · 9 months ago

          If we apply the current ruling of the US Copyright Office, then the prompt writer cannot claim copyright if AI produced the majority of the final product. The AI itself is software and ineligible for copyright; we can debate sentience when we get there. The researchers are also out, as they simply produce the tool, unless you’re keen on giving companies like Canon and Adobe spontaneous ownership of the media their equipment and software has created.

          As for the artists the AI output is based upon, we already have legal precedent for this situation. Sampling has been a common aspect of the music industry for decades now. Whenever a musician samples work from others, they are required to get a license and pay royalties, at an agreed percentage/amount based on performance metrics. Photographers and filmmakers are also required to have releases (rights to a person’s image, the likeness of a building) and also pay royalties. Actors are also entitled to royalties when licensing out their likeness. This has been the framework that allowed artists to continue benefiting from their contributions as companies min-maxed markets.

          Hence Shutterstock’s terms for copyright on AI images both build upon legal precedent and could be the first step in getting AI work copyright protection: obtaining the rights to legally use the dataset. The second step would be determining how to pay out royalties based on how the AI called and used images from the dataset. The system isn’t broken by any means; it’s the public’s misunderstanding of the system that makes the situation confusing.