cross-posted from: https://lemmy.world/post/3320637

YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead::The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.

  • @Pyr_Pressure@lemmy.ca
    81 points · 1 year ago

    I think the issue isn’t just providing access to the content. It’s using algorithms that make it more likely for deranged people to view more and more content that fuels their motives for hateful acts, instead of trying to reduce how often that content is seen, all because the companies make more money the more people watch, whether it’s harmful or not.

    • FlashMobOfOne
      33 points · 1 year ago

      This.

      I don’t know about Reddit, but YouTube 100% drives engagement by feeding users increasingly inflammatory and hateful content.

    • @merc@sh.itjust.works
      27 points · 1 year ago

      Yeah, the difference is in whether or not the company is choosing what to put in front of a viewer’s eyes.

      For the most part, an ISP just shows people what they request. If someone gets bomb-making directions from YouTube, it would be insane to sue AT&T because AT&T simply delivered the appropriate packets when that person visited YouTube.

      On the other end of the spectrum is something like Fox News. They hire every host, give them timeslots, have the opportunity to vet guests, accept advertising money to run against their content, and so on.

      Section 230 of the Communications Decency Act treats “interactive computer services” like YouTube and Reddit as if they’re just ISPs, merely hosting content that is generated by users. OTOH, YouTube and Reddit use ML systems to decide what users are shown. In the case of YouTube, the push to suggest content to users is pretty strong. You could argue they’re much closer to the Fox News side of things than to the ISP side these days. There’s no human making the decisions on what content should be shown, but does that matter?
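      To make that distinction concrete, here is a minimal, purely hypothetical sketch (none of this reflects YouTube’s or Reddit’s actual systems; all names, data, and weights are invented) contrasting an ISP-style “serve what was requested” model with an engagement-ranked feed:

```python
# Hypothetical sketch: serving what a user asked for vs. the platform
# choosing what to show based on predicted engagement.

def isp_deliver(requested_url, content_store):
    """ISP-style: return exactly what the user requested, nothing more."""
    return content_store.get(requested_url)

def engagement_feed(candidates, predicted_watch_minutes):
    """Recommender-style: the platform picks the ordering. A pure
    engagement objective surfaces whatever keeps people watching,
    with no penalty for how harmful the content is."""
    return sorted(candidates,
                  key=lambda c: predicted_watch_minutes[c],
                  reverse=True)

store = {"/history-doc": "calm lecture", "/outrage-clip": "inflammatory rant"}
watch = {"/history-doc": 12.0, "/outrage-clip": 45.0}

print(isp_deliver("/history-doc", store))      # the user's explicit choice
print(engagement_feed(list(watch), watch)[0])  # the platform's choice
```

      The point of the comment above is that only the second function makes an editorial choice; the first just fulfills a request.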

      • @ChillCapybara@discuss.tchncs.de
        9 points · 1 year ago

        Yep. I often fall asleep to long YouTube videos that are science or history related. The algorithm is the reason why I wake up at 3am to Joe Rogan. It’s like a terrible autocomplete.

    • @assassin_aragorn@lemmy.world
      10 points · 1 year ago

      Absolutely. The other day, while searching for an old article that was more optimistic about climate outcomes, I saw a Google ad (maybe from PragerU) claiming climate change isn’t real. The ad was actually labeled as a suggestion, and thankfully you could report it, which I did immediately. It pissed me off a ton.

      A friend recently shared a similar suggested video/ad they got on YouTube, one claiming “Ukrainians are terrorists”. It was from PragerU or TPUSA.

      I can see the argument for allowing these ads to exist as a matter of free speech, fine. But actively promoting them is very different, and the lawsuit has merit on that point. I’d prefer that this content be actively minimized, but at the very least it shouldn’t be promoted.