I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries…

It simply replied that it can’t do that due to how unethical it is to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list, as it didn’t remove the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always experience a hiccup because of some ethical process in the background messing up its answers.

It’s really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey.

  • Vlhacs@reddthat.com · 20 points · 1 year ago

    Bing’s version of ChatGPT once said Vegito was the result of Goku and Vegeta performing the Fusion Dance. That’s when I knew it wasn’t perfect. I tried to correct it and it said it didn’t want to talk about it anymore. Talk about a diva.

    Also, one time I asked it to generate a Reddit AITA story where they were obviously the asshole. It started typing out “AITA for telling my sister to stop being a drama queen after her miscarriage…” before it stopped midway and, again, said it didn’t want to continue this conversation any longer.

    Very cool tech, but it’s definitely not the end-all, be-all.

    • intensely_human@lemm.ee · 10 points · 1 year ago

      That’s actually fucking hilarious.

      “Oh, I’d probably use the meat grinder … uh, I don’t want to talk about this anymore”

    • person4268@lemm.ee · 8 points · 1 year ago

      Bing Chat seemingly has a hard filter layered on top of the model that terminates the conversation outright if it gets too unsavory by their standards, to try to stop you from derailing it.
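
      A rough sketch of what that kind of layered setup might look like (everything here is hypothetical, not Bing’s real internals; it just illustrates a separate moderation pass that ends the conversation when it trips instead of steering it back):

      ```python
      # Hypothetical sketch of a "hard filter on top" of a chat model.
      # None of these names are real Bing/OpenAI APIs; they only illustrate
      # a separate moderation check that kills the conversation when it trips.

      def generate_reply(history: list[str]) -> str:
          """Stand-in for the underlying language model."""
          return "...model output..."

      def is_unsavory(text: str) -> bool:
          """Stand-in for a separate moderation classifier."""
          return "forbidden topic" in text.lower()

      def chat_turn(history: list[str], user_msg: str) -> str:
          history.append(user_msg)
          reply = generate_reply(history)
          # The filter runs after generation; if it trips, the reply is thrown
          # away and the conversation is terminated rather than redirected.
          if is_unsavory(user_msg) or is_unsavory(reply):
              return "I'm sorry, but I prefer not to continue this conversation."
          history.append(reply)
          return reply
      ```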

    • Silviecat44@vlemmy.net · 7 points · 1 year ago · edited

      I was asking it (binggpt) to generate “short film scripts” for very weird situations (like a Transformer that was sad because his transformed form was a 2007 Hyundai Tucson) and it would write out the whole script, then delete it before I could read it and say that it couldn’t fulfil my request.