Imagine your search terms, keystrokes, private chats and photographs being monitored every time they are sent. Millions of students across the U.S. don’t have to imagine this deep surveillance of their most private communications: it’s a reality that comes with their school districts’ decision to install AI-powered monitoring software such as Gaggle and GoGuardian on students’ school-issued machines and accounts.

“As we demonstrated with our own Red Flag Machine, however, this software flags and blocks websites for spurious reasons and often disproportionately targets disadvantaged, minority and LGBTQ youth,” the Electronic Frontier Foundation (EFF) says.

The companies making the software claim it’s all done for the sake of student safety: preventing self-harm, suicide, violence, and drug and alcohol abuse. While that is a noble goal, given that suicide is the second leading cause of death among American youth aged 10-14, no comprehensive or independent studies have shown an increase in student safety linked to the use of this software. Quite the contrary: a recent comprehensive RAND research study shows that such AI monitoring software may cause more harm than good.

  • @TexMexBazooka@lemm.ee
    2 months ago

    My company exclusively deploys machines with physical coverings for the camera and hardware disconnects for the mics.

    • Vodulas [they/them]
      2 months ago

      Good! Not all companies do that, and I highly doubt school districts in most places will. They tend to be underfunded and understaffed.