I asked Google Bard whether it thought Web Environment Integrity was a good or bad idea. Surprisingly, not only did it respond that it was a bad idea, it even went on to urge Google to drop the proposal.

  • notfromhere
    1 year ago

    Hey, that comment’s a bit off the mark. Transformers don’t just memorize chunks of text, they’re way more sophisticated than that. They use attention mechanisms to figure out what parts of the text are important and how they relate to each other. It’s not about memorizing, it’s about understanding patterns and relationships. The paper you linked doesn’t say anything about these models just regurgitating information.
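The attention mechanism mentioned above can be sketched in a few lines. This is a minimal, illustrative numpy version of scaled dot-product attention (shapes, names, and the random inputs are all assumptions for the sketch, not anyone's actual model): each output token is a weighted mix of value vectors, where the weights come from comparing queries against keys, rather than a lookup of memorized text.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Scores: how relevant each key is to each query,
    # scaled by sqrt(d) to keep softmax gradients stable.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax turns scores into weights that sum to 1 per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output: a weighted blend of the values, not a retrieval
    # of any single stored chunk.
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))  # 4 token positions, dim 8 (illustrative sizes)
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of `w` sums to 1, so every output position attends across all inputs in proportion to learned relevance, which is the "figuring out what parts are important" part of the argument.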

    • graham1@gekinzuku.com
      1 year ago

      I believe your “They use attention mechanisms to figure out which parts of the text are important” is just a restatement of my “break it into contextual chunks”, no?