Who’d have thunk it :)

  • @KoboldCoterie@pawb.social · 13 points · 1 year ago

    This would actually be really interesting to observe: AI training on AI-generated content that was itself trained on AI-generated content, and so on - and seeing how quickly the output breaks down, and in what ways.
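
    Just as a thought experiment, you can see the flavour of it with a toy loop in plain NumPy (this is only an illustration of compounding sampling error, not a simulation of any real LLM): fit a simple “model” to data, sample from it, fit the next generation to those samples, and watch the fitted parameters drift.

    ```python
    # Toy version of the loop described above: fit a simple "model" (here just
    # a Gaussian) to data, sample a new dataset from that fit, then fit the
    # next generation to those samples. The fitted parameters drift purely
    # from compounding sampling error - an illustration of the idea only.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=0.0, scale=1.0, size=500)   # generation 0: "real" data

    for generation in range(1, 11):
        mu, sigma = data.mean(), data.std()           # "train" on the current data
        data = rng.normal(mu, sigma, size=500)        # next generation sees only model output
        print(f"gen {generation:2d}: mean={mu:+.3f} std={sigma:.3f}")
    ```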

  • Stefen Auris · 8 points · 1 year ago

    Recursion is fun. Recursion is fun. Recursion is fun. Recursion is fun.

  • Faoluin · 8 points · 1 year ago

    If predictive models can’t survive without someone else to constantly give them input, does that make them digital parasites?

  • @darq@pawb.social · 7 points · edited · 1 year ago

    Entirely predictable. AI will put its own sources out of business, and we will be left poorer for it.

    But so long as someone can make a buck, they will burn the world down for it.

  • shackled · 5 points · 1 year ago

    Hilarious. I don’t see any of the big companies doing anything to fix this issue anytime soon.

  • Southern Wolf · 3 points · 1 year ago

    Thing is, this isn’t really how AI training works, and training on the outputs of another AI can actually be done quite effectively. That’s what Stanford used to train their (comparably) small LLM, which was very competent despite its size. It was trained on the outputs of GPT (iirc) and held its own much better than other models in a similar category, which is also what opened the door to smaller, more specialized models being useful, rather than giant ones like GPT.
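
    To make that concrete, the general shape of it is: collect the big model’s answers to a pile of prompts, then fine-tune the small model on those (prompt, answer) pairs. A bare-bones sketch is below; teacher_generate and build_synthetic_dataset are made-up placeholder names, not any real library’s API.

    ```python
    # Sketch of building an instruction dataset from a larger "teacher"
    # model's outputs, which a smaller student model is then fine-tuned on.
    # teacher_generate() is a placeholder, not a real API call.
    import json
    from typing import Dict, List


    def teacher_generate(prompt: str) -> str:
        """Placeholder for a call to the big teacher model (e.g. an API request)."""
        return f"(teacher's answer to: {prompt})"


    def build_synthetic_dataset(prompts: List[str]) -> List[Dict[str, str]]:
        """Pair each prompt with the teacher's output to form fine-tuning examples."""
        return [{"instruction": p, "response": teacher_generate(p)} for p in prompts]


    if __name__ == "__main__":
        seed_prompts = [
            "Explain recursion in one sentence.",
            "List three uses of a hash map.",
        ]
        with open("synthetic_instructions.jsonl", "w") as f:
            for example in build_synthetic_dataset(seed_prompts):
                f.write(json.dumps(example) + "\n")
        # The resulting file is what a normal fine-tuning run for the small
        # student model would then consume.
    ```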

    Now, image generation via diffusion might be more troublesome, but that’s fairly easily mitigated through several means, including a human or automated discriminator, which basically turns the pipeline into a pseudo-GAN. There are also other approaches that aren’t as affected (from what I know, at least), such as GANs themselves. But given that most image AIs are trained on datasets like LAION, AI images being uploaded online will have no effect on that, not for quite a while at least, if ever.
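
    For the automated-discriminator part, the idea is just to score each generated image and drop anything below a threshold before it can land in a training set. A minimal sketch, assuming a made-up quality_score() as a stand-in for whatever trained classifier, aesthetic model, or human review pass you’d actually use:

    ```python
    # Minimal sketch of filtering generated images with a discriminator
    # before they are allowed into a training set. quality_score() is a
    # stand-in for a real scorer (a classifier, aesthetic model, or human).
    import random
    from typing import Iterable, List


    def quality_score(image_path: str) -> float:
        """Placeholder scorer; a real one would load the image and run a model."""
        return random.random()


    def filter_generations(image_paths: Iterable[str], threshold: float = 0.7) -> List[str]:
        """Keep only the generations the discriminator rates above the threshold."""
        return [path for path in image_paths if quality_score(path) >= threshold]


    if __name__ == "__main__":
        batch = [f"gen_{i:03d}.png" for i in range(10)]
        kept = filter_generations(batch)
        print(f"kept {len(kept)} of {len(batch)} generated images")
    ```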

    • @ParsnipWitch@feddit.de · 1 point · 1 year ago

      This prediction assumes AI being trained exclusively on AI-generated content for a long time. There is no example of that yet.