I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries…

It simply replied that it can’t do that because it would be unethical to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list since it hadn’t removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hit a hiccup because of some ethical process in the background messing up its answers.

It’s really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey.

  • Bappity
    17 • 1 year ago

    sometimes it refuses to do anything at all if I mention certain sites that it thinks are piracy and gets all whiny with me >_>

    • AdventureSpoon
      12 • 1 year ago

      If we really want useful AI tools, they need to be open source and customizable by the user.

      • @djsaskdja@reddthat.com
        6 • 1 year ago

        We have those already. It’s just a massive undertaking to turn those tools into something useful for an end-user. I think in the next decade or so we’ll see more open source projects catch on.

        • @kadu@lemmy.world
          3 • 1 year ago

          There are a few tools that can run generative language models locally, with a pretty UI, with zero effort: download the executable, install it, run it. The issue, though, is that using your own hardware means running significantly weaker models… So you ask something simple, wait 5 minutes for it to generate an answer, and it’s a super bad answer that looks like an iPhone’s autocorrected sentence.
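
          For anyone curious what that looks like without the pretty UI, here’s a minimal sketch in Python using the Hugging Face transformers library. The model name and prompt are purely illustrative; any small chat model your hardware can fit behaves about the same:

          ```python
          # Minimal local text-generation sketch. The model choice is just an
          # example of something small enough for consumer hardware; smaller
          # models respond faster but give noticeably worse answers.
          from transformers import pipeline

          generator = pipeline(
              "text-generation",
              model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
          )

          prompt = "Suggest three countries to visit outside Africa."
          output = generator(prompt, max_new_tokens=80, do_sample=True, temperature=0.7)
          print(output[0]["generated_text"])
          ```

          Running that on plain CPU makes the tradeoff obvious: you do get an answer eventually, just nowhere near the quality of the hosted chatbots.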