Greg Rutkowski, a digital artist known for his epic fantasy style, opposes AI art, but his name and style have been frequently used in AI art generators' prompts without his consent. In response, Stability AI removed his work from the training dataset for Stable Diffusion 2.0. However, the community has since created a LoRA model that emulates Rutkowski's style against his wishes. While some argue this is unethical, others justify it on the grounds that Rutkowski's art was already widely used in Stable Diffusion 1.5. The debate highlights the blurry line between innovation and infringement in the emerging field of AI art.

  • FaceDeer@kbin.social · 1 year ago

    The speed doesn’t factor into it. Modern machines can stamp out metal parts vastly faster than blacksmiths with a hammer and anvil can, are those machines doing something wrong?

    • Pulse@dormi.zone · 1 year ago

      The machine didn’t take the blacksmith’s work product and flood the market with copies.

      The machine wasn’t fed 10,000 blacksmith-made hammers and then told to, sorta, copy those.

      Justify this all you want, throw all the bad analogies at it you want, it’s still bad.

      Again, if this wasn’t bad, the companies would have asked for permission. They didn’t.

      • FaceDeer@kbin.social · 1 year ago

        That’s not the aspect you were arguing about in the comment I’m responding to. You said:

        You keep comparing what one person, given MONTHS or YEARS of their life could do with one artists work to a machine doing NOT THE SAME THING can do with thousands of artists work.

        And that’s what I’m talking about here. The speed with which the machine does its work is immaterial.

        Though frankly, if the machine stamping out parts had somehow “learned” how to do it by looking at thousands of existing parts, that would be fine too. So I don’t see any problem here.

        • Pulse@dormi.zone · 1 year ago

          And that’s where we have a fundamental difference of opinion.

          A company hiring an engineer to design a machine that makes hammers, then hiring one or more people to build that machine, is the company benefiting from the work product of people it hired. While this may impact the blacksmith, they did not steal from the blacksmith.

          A company taking someone else’s work product to then build their product, without compensation or consent, is theft of labor.

          I don’t see those as equivalent situations.

          • FaceDeer@kbin.social · 1 year ago

            At least now you’re admitting that it’s a difference of opinion, that’s progress.

            You think it should be illegal to do this stuff. Fine. I think copyright duration has been extended ridiculously long and should be a flat 30 years at most. But in both cases our opinions differ from what the law actually says. Right now there’s nothing illegal about training an AI off of someone’s lawfully obtained published work, which is what was done here.

            • Pulse@dormi.zone · 1 year ago

              I’m not a fan of our copyright system. IMO, it’s far too long, and it should also include clauses that place anything not available for (easy) access in the public domain.

              Also, I’m not talking about what laws say, should say or anything like that.

              I’ve just been sharing my opinion that it’s unethical and I’ve not seen any good explanation for how stealing someone else’s labor is “good”.