
Screenshot of a tumblr post by hbmmaster:

the framing of generative ai as “theft” in popular discourse has really set us back so far like not only should we not consider copyright infringement theft we shouldn’t even consider generative ai copyright infringement

who do you think benefits from redefining “theft” to include “making something indirectly derivative of something created by someone else”? because I can assure you it’s not artists

okay I’m going to mute this post, I’ll just say,

if your gut reaction to this is that you think this is a pro-ai post, that you think “not theft” means “not bad”, I want you to think very carefully about what exactly “theft” is to you and what it is about ai that you consider “stealing”.

do you also consider other derivative works to be “stealing”? (fanfiction, youtube poops, gifsets) if not, why not? what’s the difference? because if the difference is actually just “well it’s fine when a person does it” then you really should try to find a better way to articulate the problems you have with ai than just saying it’s “stealing from artists”.

I dislike ai too, I’m probably on your side. I just want people to stop shooting themselves in the foot by making anti-ai arguments that have broader anti-art implications. I believe in you. you can come up with a better argument than just calling it “theft”.

  • starman2112@sh.itjust.works · 7 points · 1 day ago

    AI art generation isn’t theft, it’s the training that’s problematic. These companies are using artists’ work for free and without credit to generate massive amounts of profits, while simultaneously putting these artists out of work.

    While I’m on this soapbox, making AI art doesn’t make you an artist any more than commissioning an art piece does. There is literally no difference between telling the AI what you want it to draw, and telling a human what you want them to draw. You are not an artist, you are a client.

    • queermunist she/her@lemmy.ml · 5 points · 23 hours ago

      While I’m on this soapbox, making AI art doesn’t make you an artist any more than commissioning an art piece does. There is literally no difference between telling the AI what you want it to draw, and telling a human what you want them to draw. You are not an artist, you are a client.

      Except humans are smart and can fill in the blanks of what you mean when you tell them to draw a picture. You don’t need any skill because the artist is skilled.

      The slop generators are dumb af; massaging them to produce good results is definitely a skill. They aren’t good enough to fill in the blanks like a human artist, and it’s up to the prompt generator to convince them to draw something that doesn’t look like shit.

      • starman2112@sh.itjust.works · 5 points · 1 day ago

        I will concede that you can be a skilled prompt generator. It is still not the same thing as producing art yourself.

        • queermunist she/her@lemmy.ml · 4 points · 1 day ago

          I think it’s art-adjacent; what we should find is that actual artists are going to be better at generating good prompts.

          • Comment105@lemm.ee · 2 points · 24 hours ago

            I don’t believe that last part is necessarily true. I looked into prompt wizardry a bit and it gets oddly complicated. Like trying to convince a monkey’s paw to actually fulfill your real wish with no bullshit.

            It didn’t seem like the sort of stuff I or better artists I know would pick up naturally.

  • kazerniel@lemmy.world · 10 points · 1 day ago

    The theft was the scraping and regurgitating of art that then puts the original artists out of work.

  • glitchdx@lemmy.world · 12 points · 1 day ago

    Holding private individuals to the same standards as megacorporations doesn’t make sense, nor does the reverse.

    Megacorporations must be held to stricter standards because of the wealth and power they wield vs ordinary people.

    It is not theft when I download a copy of a game that cannot be purchased anymore.

    It is theft when openai dumps my novel notes into “the pile” to train their ais for monetary gain and without my permission.

  • Nat (she/they)@lemmy.blahaj.zone · 10 points · 1 day ago

    Capitalists found a way to exploit artists even harder, so that now they don’t even need to pay them.

    I don’t think people would care quite as much if gen AI merely existed (I’m sure many would still dislike it, but just for being soulless). But it doesn’t just exist: it also destroys artists’ livelihoods and the prevalence of their art, using their own work to do it. I don’t really care if it’s technically theft or not; it’s doing harm to society regardless.

  • Tar_Alcaran@sh.itjust.works · 37 points · 2 days ago

    The premise here is wrong.

    The theft isn’t “ai made a derivative work”. The theft is “human ai bros scraped all the stuff I made, without permission or compensation, and used it as training data”.

    The problem is that art is being used for purposes the artist explicitly disagrees with. Imagine your artwork as a backdrop for a company that steals candy from babies to feed elephant poachers. In a normal world, you can at least sue that company to take it down.

    But when OpenAI uses your artwork to pump thousands of tons of CO2 into the air, you can’t do shit, and according to OP, you shouldn’t even complain about your work being taken.

    • BussyGyatt@feddit.org · 4 points · 1 day ago

      op’s complaint is that “ai is theft/copyright infringement” is a fundamentally different argument than “ai is energy-intensive/pollutive”, and that if youre upset about the co2 then argue against the co2; if youre upset about copyright infringement, why are other derivative works valid? it was in the op actually.

      • Tar_Alcaran@sh.itjust.works · 6 points · 1 day ago

        No, the problem is “my work is being used for something I don’t want, and didn’t agree to”. That could be pollution, it could be people who dislike vowels. Artists get a say in the use of their work, and AI bros break those laws.

        • BussyGyatt@feddit.org · 1 point · 1 day ago

          okay if the problem is “the thing i made is being used in a way i dont like,” that’s still separate from theft/copyright infringement.

  • SoftestSapphic@lemmy.world · 2 points · 23 hours ago

    AI people intentionally misunderstanding how the tech works but defending it anyway lets me know that nobody smart actually supports this garbage.

    If it needs the training data to make the output, then it is copying the training data.

    And if the training data was stolen, that means AI is theft.

    FFS

  • wizzor@sopuli.xyz · 12 points · 2 days ago

    This is interesting. I agree that stealing isn’t the right category. Copyright infringement may be, but there needs to be a more specific question we are exploring.

    Is it acceptable to make programmatic transformations of copyrighted source material without the copyright holder’s permission for your own work?

    Is it acceptable to build a product which contains the copyrighted works of others without their permission? Is it different if the works contained in the product are programmatically transformed prior to distribution?

    Should the copyright holders be compensated for this? Is their permission necessary?

    The same questions apply to the use of someone’s voice or likeness in products or works.

    • Zetta@mander.xyz · 3 points · 1 day ago

      Is it acceptable to build a product which contains the copyrighted works of others without their permission? Is it different if the works contained in the product are programmatically transformed prior to distribution?

      Somebody correct me if I’m wrong, but my understanding of how image generation models and their training work is that the end product, in fact, does not contain any copyrighted material or any transformation of that copyrighted material. The training process refines a set of numbers in the model, but those numbers can’t really be considered a transformation of the input.

      To preface what I’m about to say: LLMs and image models are absolutely not intelligent, and it’s fucking stupid that they’re called AI at all. However, if you look at somebody’s art and learn from it, you don’t contain a copyrighted piece of their work in your head, or a transformation of that copyrighted work. You’ve just refined your internal computer’s knowledge and understanding of the work. I believe the way image models are trained could be compared to that.
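
      To make “refines a set of numbers” concrete, here’s a minimal, purely illustrative sketch (assuming PyTorch and a toy model; not any vendor’s actual training pipeline). The point it demonstrates: training nudges a fixed set of weights toward the data, and the file saved afterwards holds only those weights, not the training images.

      ```python
      # Toy illustration only: training updates parameters (numbers); the saved
      # file contains those parameters, not copies of the training images.
      import torch
      import torch.nn as nn

      model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 64))
      optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

      # Stand-ins for training images: random 8x8 grayscale images, flattened.
      training_images = [torch.rand(64) for _ in range(100)]

      for image in training_images:
          noise = torch.rand(16)
          prediction = model(noise)
          loss = ((prediction - image) ** 2).mean()  # distance from this image
          optimizer.zero_grad()
          loss.backward()   # gradients with respect to the weights
          optimizer.step()  # nudge the weights slightly

      torch.save(model.state_dict(), "toy_model.pt")  # parameter tensors only
      print(sum(p.numel() for p in model.parameters()), "weights saved; no image data stored")
      ```

      Whether that process should count, legally or morally, as a “transformation of the input” is of course exactly what this thread is arguing about.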

      • burgerpocalyse@lemmy.world · 4 points · 1 day ago

        the generated product absolutely contains elements of the things it copied from. imagine the difference between someone making a piece of art that is heavily inspired by someone else’s work VS directly tracing the original and passing it off as entirely yours

        • Zetta@mander.xyz · 1 point · 22 hours ago

          I understand that’s how you think of it, but I’m talking about the technology itself. There is absolutely no copy of the original work, in the sense of ones and zeros.

          The image generation model itself does not contain any of the data from the works it was trained on, so the output of the model can’t be considered copyrighted work.

          Yes, you can train models to copy artists’ styles or work, but it’s not like tracing the image at all. Your comparison is completely wrong. It is a completely unique image generated from the model, because the model itself does not contain any of the original work.

      • FatCrab · 1 point · 22 hours ago

        This is generally correct, though diffusion models and GPTs work in totally different ways. Assuming an entity had lawful access to the image in the first place, nothing that persists in a trained diffusion model can be realistically considered to be a copy of any particular training image by anyone who knows wtf they’re talking about.

    • mindbleach@sh.itjust.works · 1 point · 1 day ago

      The magic word here is transformative. If your use of source material is minimal and distinct, that’s fair use.

      If a 4 GB model contains the billion works it was trained on - it contains four bytes of each.

      What the model does can be wildly different from any particular input.
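
      As a quick back-of-envelope check of those figures (the 4 GB model size and one billion works are both taken from the comment above, not measured from any real model):

      ```python
      # Rough arithmetic only, using the comment's numbers as assumptions.
      model_size_bytes = 4 * 1024**3            # a 4 GB model
      training_works = 1_000_000_000            # one billion training works
      print(model_size_bytes / training_works)  # ~4.3 bytes "per work"
      ```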

  • tyler@programming.dev · 17 points · 2 days ago

    “It’s fine if a person does it” is a fantastic argument. Saying that it’s ok to allow robots to continue to replace every part of human life will only lead to suffering for literally everything in existence. Computers can destroy and create in milliseconds what might take humans a lifetime to achieve. If this isn’t an incredibly good reason to regulate the shit out of ai then what is?!?!?

    Like yes, currently it’s incredibly difficult to get something non-derivative out of generative AI, e.g. when using it as a tool like Photoshop. But that most definitely will not be the case in a few years. This is by far the steepest, slipperiest, most ridiculous slope we could be on and it’s not even close.

    This is the biggest problem with technology: regulation is reactionary rather than preemptive. Not taking action immediately has already gotten Earth into a ridiculously bad situation. If we continue to allow it, it’s only going to get worse and harder to undo.

  • Susaga@sh.itjust.works · 32 points · 2 days ago

    AI images try to replicate the style of popular artists by using their work, often including work that was behind a paywall and taken without payment, thus denying the artists revenue. AI has taken something from the artist, and cost the artist money. Until such a time as we come up with a new word for this new crime, we’ll call it by the closest equivalent: theft.

    Also, someone did an experiment and typed “movie screenshot” into an AI and it came back with a nearly identical image from Endgame. Not transformative enough to be anything but copyright infringement.

    • itslilith@lemmy.blahaj.zone · 11 points · 2 days ago

      Defining “taken something behind a paywall and thus denied them revenue” as theft is the exact same argument movie studios make when you pirate a movie. Theft implies that the original is gone. If I steal your car, you don’t have a car. If I pirate a movie, we both have that movie. As someone who supports piracy, I would be careful about conflating piracy with theft. I think that’s the entire point the post is making.

      Fuck AI slop. There’s enough other arguments against it. It destroys the environment and artists’ livelihoods. We can point that out without supporting corporate copyright talking points.

    • unhrpetby@sh.itjust.works · 2 points · 1 day ago

      …thus denying the artists revenue.

      This assumes they would otherwise pay for it, and that they measurably harmed the artist’s revenue. Those aren’t a given.

      Until such a time as we come up with a new word…

      Use of copyrighted material without permission and possible deprivation of revenue. It doesn’t need to be a single word.

    • PlzGivHugs@sh.itjust.works · 5 points · 2 days ago

      AI images try to replicate the style of popular artists by using their work, often including work that was behind a paywall and taken without payment, thus denying the artists revenue. AI has taken something from the artist, and cost the artist money. Until such a time as we come up with a new word for this new crime, we’ll call it by the closest equivalent: theft.

      I’d argue it’s much closer to piracy or freebooting. Generally, its use doesn’t hurt artists, seeing as a random user isn’t going to spend hundreds or thousands to hire a talented artist to create shitposts for them. That doesn’t necessarily make it okay, but it also doesn’t directly hurt anyone. In cases of significant commercial use, or copyright infringement, I’d argue it’s closer to freebooting: copying another’s work and using it for revenue without technically directly damaging the original. Both of these are crimes, but both are more directly comparable to, and less severe than, actual theft, seeing as the artist loses nothing.

      Also, someone did an experiment and typed “movie screenshot” into an AI and it came back with a nearly identical image from Endgame. Not transformative enough to be anything but copyright infringement.

      Copyrighted material is fed into an AI as part of how it works. This doesn’t mean that anything that comes out of it is or is not copyrighted. Copyrighted material is also used in Photoshop, for example, but as long as you don’t use Photoshop to infringe on someone else’s copyright, there isn’t anything intrinsically wrong with Photoshop’s output.

      Now, if your complaint is that much of the training data is pirated or infringes on the licensing it’s released under, that’s another matter. Endgame isn’t a great example, given that it can likely be bought with standard copyright limitations, and ignoring that, it’s entirely possible Disney has been paid for their data. We do know huge amounts of smaller artists have had their work pirated to train AI, though, and because of the broken nature of our copyright system, they have no recourse - not through the fault of AI, but through corrupt, protectionist governments.

      All that said, there are still plenty of reasons to hate AI (and especially AI companies), but I don’t think the derivative nature of the work is the primary issue. Not when they’re burning down the planet, flooding our media with propaganda, and bribing governments, just to create derivative, acceptable-at-best “”“art”“”. Saying AI is the problem is an oversimplification - we can’t just ban AI to solve this. Instead, we need to address the problematic nature of our copyright laws, legal system, and governments.

      • Susaga@sh.itjust.works · 2 points · 2 days ago

        No, it is theft. They use an artist’s work to make an image they would otherwise pay the artist to make (a worse version, but still). And given how I’ve seen an image with a deformed Patreon logo in the corner, they didn’t pay what they should have for the images. They stole a commission.

        And it is copyright violation. There have been successful lawsuits over much less than a direct image of RDJ in the Iron Man suit with the Infinity Stones on his hand. And if they won’t pay an artist’s rates, there’s no way they’d pay whatever Disney would charge them.

        Yes, there’s a lot of problems with AI. And yes, AI is a part of larger issues. That doesn’t mean theft isn’t also an issue with AI.

        AI is a nazi-built, kitten blood-powered puppy kicking machine built from stolen ambulance parts. Even if stealing those ambulance parts is a lesser sin than killing those kittens, it’s still a problem that needs to be fixed. Of course, AI will never be good, so we need to get rid of the whole damn thing.

        • PlzGivHugs@sh.itjust.works · 1 point · 1 day ago

          No, it is theft. They use an artist’s work to make an image they would otherwise pay the artist to make (a worse version, but still). And given how I’ve seen an image with a deformed patreon logo in the corner, they didn’t pay what they should have for the images. They stole a commission.

          But were they (the AI users) going to pay for the content? I have never paid for a Patreon, given that I don’t really have any disposable income. Why would I start, just because AI exists? Just because a sale may be made in some contexts doesn’t mean one has been made.

          And it is copyright violation. There have been successful lawsuits over much less than a direct image of RDJ in the iron man suit with the infinity stones on his hand.

          It’s a copyright violation when material is made that violates existing copyright. It isn’t copyright infringement to take data from media, or to create derivative works.

          And if they won’t pay an artist’s rates, there’s no way they’d pay whatever Disney would charge them

          Disney has lawyers. Small artists don’t.

          AI is a nazi-built, kitten blood-powered puppy kicking machine built from stolen ambulance parts. Even if stealing those ambulance parts is a lesser sin than killing those kittens, it’s still a problem that needs to be fixed. Of course, AI will never be good, so we need to get rid of the whole damn thing.

          Banning AI doesn’t stop the Nazis from running the government or influencing the populace, it doesn’t stop them burning the planet, and it doesn’t stop them from pirating work and otherwise exploiting artists. Hell, politicians have been doing all of these things without repercussions for a century. If you want the rich and powerful to stop pirating and freebooting artists’ work, maybe the first step is to ban that (or rather, enforce it) rather than a technology two steps removed?

          • _g_be@lemmy.world · 5 points · 1 day ago

            But were they (the AI users) going to pay?

            In your head, is AI being used solely by common people for fun little prompts? If you build this machine that replaces the artist, corporations can and will use it that way.

            Big movie studios will use it to generate parts (and eventually all) of a movie. They can use this as leverage to pay the artists less and hire fewer of them. Animators, actors, voice actors.

            If you want the rich and powerful to stop pirating and freebooting artists’ work, maybe the first step is to ban that (or rather, enforce it) rather than a technology two steps removed?

            If a movie studio pirated work and used it in a film, that’s against copyright and we could sue them under current law.
            But if they are paying openAI for a service and it uses copyrighted material, then since openAI did the stealing and not the studio, it’s not clear if we can sue the studio.

            Logically we would pursue openAI then, but you’re arguing that we shouldn’t because it’s “two steps removed”.

            It seems like it’s being argued that, because of the layer of abstraction created when large quantities of media are used rather than an individual’s work, it’s suddenly a victimless crime. That because what’s being done is not currently illegal, it must not be immoral either.

            • PlzGivHugs@sh.itjust.works · 1 point · 1 day ago

              Big movie studios will use it to generate parts (and eventually all) of a movie. They can use this as leverage to pay the artists less and hire fewer of them. Animators, actors, voice actors.

              Only if it’s profitable, and given that AI output is inherently very limited, it won’t be. AI can only produce lower quality, derivative works. Some works might not be easy to distinguish, but that’s only on a small scale and in isolation.

              If a movie studio pirated work and used it in a film, that’s against copyright and we could sue them under current law.
              But if they are paying openAI for a service and it uses copyrighted material, then since openAI did the stealing and not the studio, it’s not clear if we can sue the studio.

              You can sue the studio, in the same way you would sue the studio if an artist working there (or even someone directing artists) created something that violates copyright, even by accident. If they publish a work that infringes on copyright, you can sue them.

              It seems like it’s being argued that, because of the layer of abstraction created when large quantities of media are used rather than an individual’s work, it’s suddenly a victimless crime.

              By that logic, anyone who takes inspiration, no matter how broad, or uses another’s work in any way, no matter how transformative, should be prevented from making their own work. That is my point. AI is just an algorithm that takes thousands of images and blends them together. It isn’t evil, any more than a paint brush is. What is, is piracy for commercial use, and non-transformative copyright infringement. Both of these are already illegal, but artists can’t do anything about it, not because companies haven’t broken the law, but rather because an independent author trying to take, for example, Meta to court is going to bankrupt themselves.

              Edit: Also notable, in terms of companies using or not using AI, is the fact that even transformative and “”“original”“” AI work cannot be copyrighted. If Disney makes a movie that’s largely AI, we can just share it freely without paying them.

              • _g_be@lemmy.world · 3 points · 1 day ago

                Only if it’s profitable, and given that AI output is inherently very limited, it won’t be. AI can only produce lower quality

                This is literally already happening. The SAG-AFTRA screen actors’ guild had to negotiate new contracts against studios that were using AI as a bargaining chip to lower their wages.

                • PlzGivHugs@sh.itjust.works · 1 point · 1 day ago

                  It wasn’t current AI voice tech that was the issue; it was the potential for future AI they were worried about. AI voices as they are now are of similar quality to pulling someone off the street and putting them in front of a mid-range mic. If you care about quality at all, you’ll always need a human (barring massive changes to how AI tech functions).

                  And to be clear, what about AI makes it the problem, rather than copyright? If I can use a voice synthesizer to replicate an actor’s voice, why is that fine and AI not? Should it not be that reproduction of an actor’s voice is right or wrong based on why it’s done and its implications, rather than because of the technology used to replicate it?

                  Edit: And to be clear, just because a company can use it as an excuse to lower wages doesn’t mean it’s a viable alternative to hiring workers. Claims that they could replace their workers with AI are just the usual capitalist bullshit excuse to exploit workers.

  • Endymion_Mallorn@kbin.melroy.org · 14 points · 2 days ago

    Yep. I really hope that the conversation around LLMs moves away from words like “theft”. Show me evidence of an artwork that an LLM has concealed from the public. I’ve looked, and all that I’ve found is that they make more media more accessible. That’s not theft. That’s piracy. That’s culture-jamming.

    So if you want to call it appropriation, fine. It’s classic EEE methods, applied to Gonzo, Dada & Punk ideas. Embrace - taking in everything and culture-jamming with meaningless text & images (painfully Dada). Extend - by turning this into both a toy and a corpo “tool”, they extended Dada into programming, articles, news media, you name it. Extinguish - when everything is a punk remix, or everything is meaningless Dada, nothing is. Therefore the true punks will be the classicists, reconstructionists, the Bible-beaters, and their ilk. And then Punk is dead.

    • julietOscarEcho@sh.itjust.works · 8 points · 2 days ago

      “AI slop is punk” has the ring of when BrewDog claimed they were doing a “punk” equity offering. 😂

      “Vibe coding is dadaist.” If you say so, but I figure intent matters…

      • Endymion_Mallorn@kbin.melroy.org · 3 points · 1 day ago

        Intent matters from the individuals who built the model and trained it. Not from those being jammed. Vibe coders are just tools the artist and punk use to advance the work and the impact.

      • LH0ezVT@sh.itjust.works · 3 points · 2 days ago

        Intention matters, or even: intention is all that matters here. Otherwise a random number generator is dadaist.

        • Endymion_Mallorn@kbin.melroy.org · 2 points · 1 day ago

          RNG only for the sake of RNG can be said to be Dada. The appearance or simulation of intent without practical purposes fits the aesthetic well. The moment it actually serves a practical purpose, it ceases to be. TempleOS’ Oracle and the ‘particle effect’ in Super Mario World are not Dada, but just outputting /dev/random could be.

          Simulation has merged with the thing being simulated and the simulacra take the originals’ places. Nonsense has become the only sense.

    • Valmond@lemmy.world · 2 points · 1 day ago

      The world needs more people like you, and I would love hearing more about all this from you 💝!

  • einlander@lemmy.world · 23 points · 2 days ago

    Yeah, I don’t agree. Unfortunately I’m not articulate enough to explain why I feel this way. I feel like they are glossing over things. How would you describe corporations willfully taking art/data/content from others without any permission, attribution, or payment and creating a tool with said information for the end goal of making profits by leveraging the work of others into a derivative work that competes with the original? (Holy run-on sentence.) If there is a better word or term than theft for what generative AI does then they should use it instead.

    • rumschlumpel@feddit.org · 12 points · 2 days ago

      It’s basically for-profit piracy. Which is still kind of a shitty term because actual pirates weren’t copying any of the goods they were taking.

      The most neutral term might be copyright infringement, though that carries all the baggage of the ‘should copyright even exist’ discussion.

      Alternatively, you could shout ‘they took our jobs’ to complain that they are letting algorithms and engineers do the work that artists want to do. IDK what to call this, but ‘theft’ or ‘robbery’ doesn’t sound right.

      • valaramech@fedia.io · 6 points · 2 days ago

        I think the biggest problem is that the idea of copyright is good, but the implementation - in most places, anyways - is complete dogshit.

        Like, I’m fairly certain the original implementation of copyright in the US only lasted 14 years or thereabouts. Like, that’s more than enough time to profit off whatever you made but short enough that it’ll be usable by others within their lifetimes. This whole “life of the author + 70 years” shit needs to die.

    • RedDragonArchfiend@sh.itjust.works · 1 point · 1 day ago

      I guess what feels off to me is that the generative AI itself does nothing of the sort; it’s the corporations creating the AI models as products that do. There are already attempts to make generative AI models that are trained exclusively on data that was licensed for it. I imagine some people would still like to push regulation against companies producing those models, though I am not one of them. I’d like to decouple the argument “this use of technology is bad, because it (devalues human works / takes away jobs / …)” from “the corporations train their generative AI models in an unethical manner”.

    • Chloé 🥕@lemmy.blahaj.zone (OP) · 5 points · 2 days ago

      i get what you mean, but at the same time, i feel like there’s not much you can do with AI that you can’t do without it (at least, in terms of art); it just takes more time and you’d most likely need to pay someone to do it. so in this case AI would take work opportunities away from people, that’s bad, but that’s not copyright infringement nor theft.

      and i’m worried that propagating the idea that training an AI model is theft could lead to even more regulation on copyright that would just end up hurting smaller artists in the end. like, most people agree that making AI art in the style of Studio Ghibli movies is scummy, but would an indie artist making art in that style be wrong too? you may think not, but if it becomes a crime to emulate an art style using AI, it takes very little extrapolation to make it a crime to emulate an art style without AI. and do i need to say why being able to copyright an art style would be awful?

      • rumschlumpel@feddit.org · 5 points · 2 days ago

        so in this case AI would take work opportunities away from people, that’s bad, but that’s not copyright infringement nor theft.

        I think it’s quite literally copyright infringement, assuming the models are fed with work from actual artists who typically don’t agree to it. Whether copyright should work this way is another matter.

        • Chloé 🥕@lemmy.blahaj.zone (OP) · 6 points · 2 days ago

          I meant it more in a “hire an artist to work on art for you VS just ask the AI to do it instead” way

          even if you’re emulating an art style in particular, that’s not copyright infringement because you can’t copyright an art style. which is good because if you could, that would be awful for a ton of artists

          it’s only copyright infringement if you ask an AI to do, say, a picture of mario. but in this case, it’s also copyright infringement if you commission an artist to do it!

          • rumschlumpel@feddit.org · 6 points · 2 days ago

            I suppose the issue is whether you want to see training an AI as equivalent to practicing as a human artist. Considering that AIs are generally made specifically as commercial products, I don’t think that’s really true, but this is definitely something that can be argued one way or the other.

            I don’t think it would be considered fine under current laws if an AI-free Adobe Photoshop was shipped with tons of copyrighted art that was scraped off the internet without the artists’ approval, even if the users aren’t allowed to use it to make commercial works that reproduce Super Mario or w/e 1:1.

          • nickwitha_k (he/him)@lemmy.sdf.org · 2 points · 2 days ago

            it’s only copyright infringement if you ask an AI to do, say, a picture of mario. but in this case, it’s also copyright infringement if you commission an artist to do it!

            Nah.

            1. Training a statistical model with unlicensed work is not the same as a human learning.

            2. Under copyright/IP laws, using a copyrighted work, without license, with the intent of competing with the copyright holder (what virtually all commercial AI models are doing) is not fair use and there is plenty of case law backing that. Whether something is transformative (arguably, training models isn’t) doesn’t even matter if the infringement is done with the intent of causing material harm to the copyright holder through competition. None of the models out there fit the criteria for fair use.

            3. Something being a tool does not magically remove all liability. This is especially true if the tool is built illegally using unlicensed intellectual property and depends on said unlicensed intellectual property to have any value (literally all major models fit this description).

  • Kernal64@sh.itjust.works · 15 points · 2 days ago

    “I believe in you. You can come up with a better argument than just theft.”

    Nah, fuck that shit. If OOP feels so strongly that it’s not theft and they wanna change how the population at large refers to something, then it’s on them to provide an alternative and convince others. This weird-ass attempt to shame people into doing things their way, especially when they haven’t really defined what they consider their way, is absolute horse shit.

    This whole post is full of this. The OOP tries to completely remove intent and method from the analysis of whether something is art theft. Those things absolutely factor into it and they’re only discounting them in order to push their weird narrative.

    AI scraping tons of work belonging to artists and then regurgitating that as original work is fucking gross, no matter what you call it. Theft seems fine to me, but I am open to calling it something else. Unfortunately OOP won’t be the one to convince me, since they neither provide reasoning for why calling it theft is bad nor say what we should call it instead and why.

    • Endymion_Mallorn@kbin.melroy.org · 5 points · 2 days ago

      The alternative is to call out copywrong in every version and every facet of existence. This isn’t theft, it’s duplication. The argument is simple: LLMs are the new printing press.

  • Rentlar@lemmy.ca · 6 points · 2 days ago

    I agree with the comment here that AI image generation is more like piracy in that you are appropriating other artists’ works without their permission.

    So I mean personally, I agree that AI art is soulless and possibly copyright infringement but not theft, whereas for people who consider piracy theft, calling AI art “theft” is not an inconsistent or hypocritical argument, in my opinion. Whether a machine or a human does it doesn’t make it stealing or not.

    I mean even Zuck’s Meta is claiming in court that they aren’t stealing books when torrenting massive amounts of them for AI training, lol, so he’s consistent there. But Nintendo on the other hand are probably seething at AI Nintendo art being “stolen” from them.

  • SqueakyBeaver@lemmy.blahaj.zone · 7 points · 2 days ago

    Disclaimer: I’ve not workshopped this much, so idk if these are the right words to convey how I feel

    I feel like using AI to generate images is akin to taking someone’s art and applying a light gaussian blur to it or putting an uncredited artist’s work in a big game.

    I know it’s done in a much more intricate way, and it’s genuinely impressive how AI companies got it to work so well, but if I try to sell AI-generated images, especially if they’re meant to closely resemble a particular artist’s work, then that’s all I’m doing.

    I don’t necessarily see it as stealing from artists (though it is threatening the livelihood of a lot of artists), but more as exploiting artists but with a new buzzword.

    If I arrange 4 pieces of art in a jpeg and then apply a whacky filter, am I actually creating anything, or am I just exploiting artists and doing something similar to copying and pasting different bits of an essay and then changing every instance of a word to a different synonym?

    I believe AI does something similar to that, albeit in a more sophisticated way that looks like creativity.