This is peak laziness. It seems that the reading list’s author used autoplag to extrude the entire 60-page supplemental insert. The author also super-promises this has never happened before.

  • paraphrand@lemmy.world · 2 days ago

    AI assistants such as ChatGPT are well-known for creating plausible-sounding errors known as confabulations, especially when lacking detailed information on a particular topic.

    No, they are hallucinations or bullshit. I won’t accept any other terms.

    • o7___o7@awful.systemsOP · 2 days ago (edited)

      If it makes you feel better, I’ve heard good folks like Emily Bender of Stochastic Parrots fame suggest that “confabulation” is the better term. “Hallucination” implies that LLMs have qualia and are accidentally sprinkling falsehoods over a true story. “Confabulation” better illustrates that they’re producing a bullshit milkshake from their training data that can only be correct accidentally.