OpenAI now tries to hide that ChatGPT was trained on copyrighted books, including J.K. Rowling’s Harry Potter series

A new research paper laid out ways in which AI developers should try to avoid showing that LLMs have been trained on copyrighted material.

  • @zbyte64@lemmy.blahaj.zone · 10 points · 1 year ago

    Ehh, “learning” is doing a lot of lifting. These models “learn” in a way that is foreign to most artists. And that’s ignoring the fact that humans are not capital. When we learn, we aren’t building a form of capital; when models learn, they are only building a form of capital.

    • @Tyler_Zoro@ttrpg.network · 8 points · 1 year ago

      Artists, construction workers, administrative clerks, police and video game developers all develop their neural networks in the same way, a method simulated by ANNs.

      This is not “foreign to most artists”; it’s just that most artists have no idea what the mechanism of learning is.

      The method by which you provide input to the network for training isn’t the same thing as learning.

      • @Sentau · 5 points · edited · 10 months ago

        deleted by creator

        • @Yendor@reddthat.com · 2 points · 1 year ago

          Do we know enough about how our brain functions and how neural networks function to make this statement?

          Yes, we do. Take a university level course on ML if you want the long answer.

        • @Sentau · 1 point · edited · 10 months ago

            deleted by creator

      • @Prager_U@lemmy.world · 1 point · 1 year ago

          This is orthogonal to the topic at hand. How does the chemistry of biological synapses alone result in a different type of learned model that therefore requires different types of legal treatment?

          The overarching (and relevant) similarity between biological and artificial nets is the concept of connectionist distributed representations, and the projection of data onto lower dimensional manifolds. Whether the network achieves its final connectome through backpropagation or a more biologically plausible method is beside the point.
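As a toy illustration of the two ideas named above (a distributed representation learned by backpropagation, and projection onto a lower-dimensional manifold), here is a minimal sketch, not from the thread itself: a linear autoencoder trained by plain gradient descent learns to compress 4-D data that secretly lives on a 2-D subspace. All names and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data that really lies on a 2-D subspace of R^4, plus a little noise.
latent = rng.normal(size=(200, 2))
basis = rng.normal(size=(2, 4))
X = latent @ basis + 0.01 * rng.normal(size=(200, 4))

W_enc = rng.normal(scale=0.1, size=(4, 2))  # encoder: R^4 -> R^2
W_dec = rng.normal(scale=0.1, size=(2, 4))  # decoder: R^2 -> R^4
lr = 0.02

for _ in range(1000):
    Z = X @ W_enc        # low-dimensional codes (the "distributed representation")
    X_hat = Z @ W_dec    # reconstruction from the 2-D projection
    err = X_hat - X      # gradient of 0.5 * ||X_hat - X||^2 w.r.t. X_hat
    # Backpropagate the reconstruction error through both weight matrices.
    grad_dec = Z.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

mse = np.mean((X @ W_enc @ W_dec - X) ** 2)
print(f"reconstruction MSE: {mse:.4f}")
```

The point of the sketch is the comment's: the learned weights store no copy of any training row; they encode a projection that captures the data's structure. Whether the update rule is backprop, as here, or a more biologically plausible alternative changes the training procedure, not the nature of the resulting representation.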

    • @Yendor@reddthat.com · 1 point · 1 year ago

      When we learn, we aren’t building a form of capital; when models learn, they are only building a form of capital.

      What do you think education is? I went to university to acquire knowledge and train my skills so that I could later be paid for those skills. That was literally building my own human capital.