• circuitfarmer@lemmy.sdf.org · 40 upvotes · 1 year ago

    It’s arguably not good that we’re normalizing people being able to use this while its training relied on other creators who were not compensated.

      • Ech@lemm.ee · 12 upvotes · 1 year ago

        Humans using past work to improve, iterate, and further contribute themselves is not the same as a program throwing any and all art into the machine learning blender to regurgitate “art” whenever its button is pushed. Not only does it not add anything to the progress of art, it erases the identity of the past it consumed, all for the blind pursuit of profit.

        • Sethayy@sh.itjust.works · 16 upvotes · 1 year ago

          Oh yeah? Tell me who invented the word ‘regurgitate’ without googling it. ‘Cause its historical identity is important, right?

          Or how about who first created the internet?

          It’s ok if you don’t know; this is how humans work, on the backs of giants.

          • Ech@lemm.ee · 4 upvotes · 1 year ago

            Me not knowing everything doesn’t mean it isn’t known or knowable. Also, there’s a difference between things naturally falling into obscurity over time and context being removed forcefully.

              • Sethayy@sh.itjust.works · 5 upvotes · 1 year ago

              And then there’s when it’s too difficult to keep them up, exactly like how you can’t know everything.

              We probably ain’t gonna stop innovation, so we might as well roll with it (especially when it’s doing a great job redistributing previously expensive assets).

                • Ech@lemm.ee · 2 upvotes · 1 year ago

                If it’s “too difficult” to manage, that may be a sign it shouldn’t just be let loose without critique. Also, innovation is not inherently good and “rolling with it” is just negligent.

                  • Ech@lemm.ee · 2 upvotes · 1 year ago

                    Does meandering into others’ conversations and arbitrarily insulting people make you feel better about yourself?

      • acutfjg@feddit.nl · 5 upvotes · 1 year ago

        Were they in public forums and sites like Stack Overflow and GitHub, where they wanted people to use and share their code?

        • ArmokGoB@lemmy.dbzer0.com · 3 upvotes · edited · 1 year ago

          Stable Diffusion uses a dataset from Common Crawl, which pulled art from public websites that allowed them to do so. DeviantArt and ArtStation allowed this, without exception, until recently.

        • Echo Dot@feddit.uk · 2 upvotes · 1 year ago

          Where did the AI companies get their code from? It’s scraped from the likes of Stack Overflow and GitHub.

          They don’t have the proprietary code that is used to run companies, because it’s proprietary and it’s never been on a public forum available for download.

    • moon_matter@kbin.social · 25 upvotes · edited · 1 year ago

      Devil’s advocate: it means that only large companies will have AI, as they would be the only ones capable of paying such a large number of people. AI is going to come anyway, except now the playing field is even more unfair, since you’ve removed the ability for an individual to use the technology.

      Instituting these laws would just be the equivalent of companies pulling the ladder up behind them after taking the average artist’s work to use as training data.

      • Corkyskog@sh.itjust.works · 1 upvote · 1 year ago

        How would you even go about determining what percentage belongs to the AI vs the training data? You could argue all of the royalties should go to the creators of the training data, meaning no one could afford to do it.

        • moon_matter@kbin.social · 1 upvote · 1 year ago

          How would you identify text or images generated by AI after they have been edited by a human? Even after that, how would you know what was used as the source for training data? People would simply avoid revealing any information and even if you did pass a law and solved all of those issues, it would still only affect the country in question.

    • mindbleach@sh.itjust.works · 9 upvotes · 1 year ago

      As distinct from human artists who pay dividends for every image they’ve seen, every idea they’ve heard, and every trend they’ve followed.

      The more this technology shovels into the big fat network of What Is Art, the less any single influence will show through.

    • Dkarma@lemmy.world · 6 upvotes · 1 year ago

      Literally the definition of greed. They don’t deserve royalties for being an inspiration and moving a weight a fraction of a percentage in one direction…