The Edmonton Police Service announced Tuesday it will become the first police force in the world to trial facial-recognition-enabled body cameras, an artificial intelligence (AI) product from Axon Enterprise.

“I want to make it clear that this facial-recognition technology will not replace the human component of investigative work,” acting Supt. Kurt Martin with EPS’ information and analytics division said during a news conference.

“In fact, the resemblances that are identified by this software will be human-verified by officers trained in facial recognition.”

Martin said the police force’s goal is to test another tool in its operations toolbox that can help further ensure public and officer safety while also respecting privacy considerations.
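
The article does not describe how the system works internally. Purely as an illustration of the human-in-the-loop process Martin describes, a flow along these lines is plausible (all names and thresholds below are hypothetical, not Axon’s):

```python
from dataclasses import dataclass

@dataclass
class CandidateMatch:
    person_id: str
    similarity: float  # similarity between face embeddings, 0..1

# Hypothetical cut-off; the real product's behaviour is not described.
REVIEW_THRESHOLD = 0.80

def flag_for_review(candidates: list[CandidateMatch]) -> list[CandidateMatch]:
    """Surface possible resemblances for a trained officer to review.

    The software never confirms an identity on its own; weak
    resemblances are simply not shown.
    """
    return [c for c in candidates if c.similarity >= REVIEW_THRESHOLD]

def confirm(candidate: CandidateMatch, officer_accepts: bool) -> str | None:
    # Only an explicit human acceptance produces an identification.
    return candidate.person_id if officer_accepts else None
```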

Axon Enterprise, an Arizona-based company, develops weapons and technology products for military, law enforcement and civilians in jurisdictions where legal.

  • CanadaPlus@lemmy.sdf.org

    You know how the police can’t force you to show ID when just walking around? Yeah, this is the same thing and they know it.

  • melsaskca@lemmy.ca

    The police state has nothing to do with Nationalism I guess. There is big money in that surveillance crap.

  • runsmooth@kopitalk.net

    Axon’s rep basically says that their mass surveillance cameras don’t see colour, just people, then follows up by saying the main factor is skin tone (??). That’s a problem that was essentially noted as far back as…2019. What development in the technology is she talking about?

    From the article, quoting Ann-Li Cooke, Axon Enterprise’s director of responsible AI:

    In response to the report, Cooke said there has been a development in the technology since 2019.

    “There are gaps in both race and gender at that time,” she said. “As we did our due diligence on evaluating multiple models, we were also looking to see if there were race-based differences, and we found that in ideal conditions, that is not the case.

    “Race is not the limiting factor today, the limiting factor is on skin tone. And so when there are varying conditions, such as distance [or] dim lighting, there will be different optical challenges with body-worn camera[s] — and all cameras — in detecting and matching darker-skinned individuals than lighter-skinned individuals.”

    Also note that the facial-recognition technology seems to have a fatal flaw when it comes to women with darker skin.

    However, Gideon Christian, an associate professor of AI and law at the University of Calgary, said the inequities attached to facial-recognition technology are too great to ignore and that he believes there is not enough recent research to suggest any significant improvement.

    “Facial-recognition technology has been shown to have its worst error rate in identifying darker-skinned individuals, especially black females,” he said.

    In some case studies, Christian said facial-recognition technology has shown about a 98 per cent accuracy rate in identifying white male faces, but that it also has about a 35 per cent error rate in identifying darker-skinned women.
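
    Disparities like these come out of per-group error-rate audits over labelled test sets. A minimal sketch of that bookkeeping, with entirely fabricated records:

    ```python
    from collections import defaultdict

    # Fabricated evaluation records: (demographic group, correctly identified?)
    results = [
        ("white_male", True), ("white_male", True), ("white_male", True),
        ("darker_skinned_female", False), ("darker_skinned_female", True),
        ("darker_skinned_female", False),
    ]

    tallies = defaultdict(lambda: [0, 0])  # group -> [errors, trials]
    for group, correct in results:
        tallies[group][0] += 0 if correct else 1
        tallies[group][1] += 1

    for group, (errors, trials) in tallies.items():
        print(f"{group}: {errors / trials:.0%} error rate over {trials} trials")
    ```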

    You know what was a problem with the technology back in 2019? These models were built primarily by white males, and their idea of “normal” hard-codes bias into the models through the data they are trained on. These “AI” products essentially show their builders’ bias by discriminating against whatever falls outside of that “normal”.

    For example, from “How tech’s white male workforce feeds bias into AI”, by Aimee Picchi:

    The report highlights several ways AI programs have created harmful circumstances to groups that already suffer from bias. Among them are:

    • An Amazon AI hiring tool that scanned resumes from applicants relied on previous hires’ resumes to set standards for ideal hires. However, the AI started downgrading applicants who attended women’s colleges or who included the word “women’s” in their resumes.
    • Amazon’s Rekognition facial analysis program had difficulty identifying dark-skinned women. According to one report, the program misidentified them as men, although the program had no problem identifying men of any skin tone.

    https://www.cbsnews.com/news/ai-bias-problem-techs-white-male-workforce/
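
    A toy sketch of the mechanism behind that resume example (the data is fabricated): a scorer fit to biased historical labels ends up penalizing the word itself.

    ```python
    from collections import defaultdict

    # Fabricated "historical hiring" data: (resume text, hired in the past?)
    resumes = [
        ("chess club captain", 1),
        ("software intern", 1),
        ("women's chess club captain", 0),   # passed over historically
        ("women's coding society lead", 0),  # passed over historically
    ]

    # Naive scorer: each word's weight is the average past-hire rate of
    # resumes containing it. Fit to biased labels, it learns the bias.
    sums = defaultdict(lambda: [0.0, 0])
    for text, hired in resumes:
        for word in text.split():
            sums[word][0] += hired
            sums[word][1] += 1

    weights = {word: total / count for word, (total, count) in sums.items()}
    print(weights["women's"])  # 0.0: the word itself is now penalized
    print(weights["captain"])  # 0.5: neutral words stay neutral
    ```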

    • CanadaPlus@lemmy.sdf.org

      Hopefully the EPS has its own server that it loads faces into domestically. Don’t we have legislation about the jurisdiction of biometric storage now?

      It’s still not great, and I have little confidence Axon can’t (cooperate with an agency to) slip in there and steal information somehow.

      • Typhoon@lemmy.ca

        Not true. A few months ago Microsoft admitted that US law overrides its privacy agreements with other countries, regardless of where the data is stored. That was in reference to a situation in France.

        As an American company, Axon would be bound the same way.

  • FaceDeer@fedia.io

    On the plus side, maybe this will encourage police to actually have their body cams on.