• @lemillionsocks@beehaw.org
    28 • 1 year ago

    We’ve already got numerous examples of how these AI and face-recognition models tend to have biases, or are fed data that accidentally has a racial bias. It’s not a stretch of the imagination to see how this can go wrong.

    • Scrubbles
      7 • 1 year ago

      Yep, the age-old “garbage in, garbage out”. If we had a perfect track record we could just feed in all the cop data, but we know for a fact that the poor and PoC are stopped more than others. Send that into an AI and it will learn those same biases.
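
      The mechanism the comment describes can be sketched in a few lines: a toy simulation (all numbers and group labels are hypothetical, invented for illustration) where two groups behave identically, but one is stopped twice as often. A naive model that "learns risk" from historical stop counts then reproduces the sampling bias as if it were a real difference:

      ```python
      import random

      random.seed(0)

      # Hypothetical scenario: groups A and B have identical underlying
      # behavior, but officers stop group B twice as often. The recorded
      # stop data therefore carries a sampling bias, not a behavioral one.
      STOP_RATE = {"A": 0.10, "B": 0.20}

      records = []
      for _ in range(100_000):
          group = random.choice(["A", "B"])
          stopped = random.random() < STOP_RATE[group]
          records.append((group, stopped))

      def learned_risk(group):
          """A naive 'predictive policing' score: the historical stop
          rate of the group -- garbage in, garbage out."""
          stops = [s for g, s in records if g == group]
          return sum(stops) / len(stops)

      for g in ("A", "B"):
          print(f"group {g}: learned risk {learned_risk(g):.2f}")
      # The model scores B roughly twice as "risky" as A, even though
      # both groups were defined to behave the same; it has learned the
      # bias in how the data was collected, not anything about behavior.
      ```

      Any real model trained on such records, however sophisticated, faces the same problem: the label it fits is "was stopped", not "did something wrong".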