Google’s AI-driven Search Generative Experience has been generating results that are downright weird and evil, e.g., the positives of slavery.

  • @scarabic@lemmy.world
    50
    10 months ago

    If it’s only as good as the data it’s trained on, garbage in / garbage out, then in my opinion it’s “machine learning,” not “artificial intelligence.”

    Intelligence has to include some critical, discriminating faculty. Not just pattern matching vomit.

    • @samus12345@lemmy.world
      20
      10 months ago

      We don’t yet have the technology to create actual artificial intelligence. It’s an annoyingly pervasive misnomer.

      • Flying Squid
        9
        10 months ago

        And the media isn’t helping. The title of the article is “Google’s Search AI Says Slavery Was Good, Actually.” It should be “Google’s Search LLM Says Slavery Was Good, Actually.”

    • @profdc9@lemmy.world
      9
      10 months ago

      Unfortunately, people who grow up in racist groups also tend to be racist. Slavery used to be considered normal and justified for various reasons. For many, killing someone who has a religion or belief different from their own is OK. I am not advocating for moral relativism, just pointing out that a computer learns what is or is not moral in the same way that humans do: from other humans.