cross-posted from: https://lemmy.world/post/3320637

YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead. The complementary lawsuits claim that the 2022 massacre was made possible by tech giants, a local gun shop, and the gunman’s parents.

  • @curiousaur@reddthat.com · 131 points · 1 year ago

    This is so so stupid. We should also sue the ISPs then, they enabled the use of YouTube and Reddit. And the phone provider for enabling communications. This is such a dangerous slippery slope to put any blame on the platforms.

    • @Pyr_Pressure@lemmy.ca · 81 points · 1 year ago

      I think the issue isn’t just providing access to the content, but using algorithms that make it more likely for deranged people to see more and more content that fuels their motives for hateful acts, instead of trying to reduce how often that content is seen. All because the platforms make more money the more people watch, whether it’s harmful or not.

      • FlashMobOfOne · 33 points · 1 year ago

        This.

        I don’t know about Reddit, but YouTube 100% drives engagement by feeding users increasingly flammable and hateful content.

      • @merc@sh.itjust.works · 27 points · 1 year ago

        Yeah, the difference is in whether or not the company is choosing what to put in front of a viewer’s eyes.

        For the most part an ISP just shows people what they request. If someone gets bomb making directions from YouTube it would be insane to sue AT&T because AT&T delivered the appropriate packets when someone went to YouTube.

        On the other end of the spectrum is something like Fox News. They hire every host, give them timeslots, have the opportunity to vet guests, accept advertising money to run against their content, and so on.

        Section 230 of the Communications Decency Act treats “interactive computer services” like YouTube and Reddit as if they’re just ISPs, merely hosting content that is generated by users. OTOH, YouTube and Reddit use ML systems to decide what the users are shown. In the case of YouTube, the push to suggest content to users is pretty strong. You could argue they’re much closer to the Fox News side of things than to the ISP side these days. There’s no human making the decisions on what content should be shown, but does that matter?
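The engagement-only ranking described above can be sketched in a few lines. This is a hypothetical illustration, not any platform’s actual system: `Candidate`, `predicted_watch_minutes`, and `outrage_score` are invented names, and real recommenders are vastly more complex. The point is structural: if the objective is watch time alone, inflammatory content that holds attention rises to the top by construction.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    predicted_watch_minutes: float  # output of some trained engagement model
    outrage_score: float            # 0..1, how inflammatory the content is

def rank_by_engagement(candidates: list[Candidate]) -> list[Candidate]:
    # The objective is predicted watch time alone;
    # outrage_score plays no role in the ordering.
    return sorted(candidates,
                  key=lambda c: c.predicted_watch_minutes,
                  reverse=True)

feed = [
    Candidate("calm history documentary", 8.0, 0.05),
    Candidate("inflammatory rant", 14.0, 0.95),
    Candidate("science explainer", 10.0, 0.10),
]

for c in rank_by_engagement(feed):
    print(c.title)
```

Run as-is, the “inflammatory rant” is ranked first purely because it is predicted to hold attention longest, which is the dynamic the comment is describing.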

        • @ChillCapybara@discuss.tchncs.de · 9 points · 1 year ago

          Yep. I often fall asleep to long YouTube videos that are science or history related. The algorithm is the reason why I wake up at 3am to Joe Rogan. It’s like a terrible autocomplete.

      • @assassin_aragorn@lemmy.world · 10 points · 1 year ago

        Absolutely. I saw a Google ad the other day, from PragerU I think, claiming climate change isn’t real, while I was searching for an old article that was more optimistic about outcomes. The ad was actually labeled as a suggestion, and thankfully you could report it, which I did immediately. It pissed me off a ton.

        A friend recently shared a similar suggested video/ad they got on YouTube, which was saying “Ukrainians are terrorists”. PragerU or TPUSA.

        I can see the argument for allowing these ads to exist as a freedom of speech thing, fine. But actively promoting these ads is very different, and the lawsuit has merit on that point. I’d prefer if this content were actively minimized, but at the very least it shouldn’t be promoted.

    • @PoliticalAgitator@lemm.ee · 28 points · 1 year ago

      If you were head of a psychiatric ward and had an employee you knew was telling patients “Boy, I sure wish someone would kill as many black people as they could”, you would absolutely share responsibility when one of them did exactly that.

      If you were deliberately pairing that employee with patients who had shown violent behaviour on the basis of “they both seem to like violence”, you would absolutely share responsibility for that violence.

      This isn’t a matter of “there’s just so much content, however can we check it all?”.

      Reddit has hosted multiple extremist and dangerous communities, claiming “we’re just the platform!” while handing over the very predictable post histories of mass shooters week after week.

      YouTube has built an algorithm and monetisation system that is deliberately designed to lure people down rabbit holes, then done nothing to stop it luring people towards domestic terrorism.

      It’s a lawsuit against companies worth billions. They’re not being executed. There are grounds to accuse them of knowingly profiting from the grooming of terrorists and if they want to prove that’s not the case, they can do it in court.

    • @firadin@lemmy.world · 23 points · 1 year ago

      Do ISPs actively encourage you to watch extremist content? Do they push that content toward people who are at risk of radicalization to get extra money?

    • narshee · 17 points · 1 year ago

      I think blaming/suing the company that is nearest to the user should work fine. (The following is hyperbole.) If you don’t do it that way, then yes, it would be a slippery slope, because you’d eventually have to sue the big bang. But that makes no sense.

      • @curiousaur@reddthat.com · 25 points · 1 year ago

        So if an attack is planned via mail you think we should sue the postal service? The phone company if it’s done over the phone?

        • narshee · 20 points · edited · 1 year ago

          No, because these things should be private. Social media however needs some kind of moderation. edit: also go blame the user too, but that should be a given

          • @curiousaur@reddthat.com · 1 point · 1 year ago

            I think just the poster should suffice, we should leave the platforms out of it. If anything, it helps to out the assholes who would post stuff that enables this.

            • narshee · 4 points · 1 year ago

              Blocking a user and removing content from a platform should be relatively easy and fast, which should help prevent organized crimes. Suing someone afterwards takes far more resources and time.

              But a platform can remove content without getting sued. Why sue them too? Because if you don’t sue their asses they don’t care.

              Of course moderation takes time and can’t be perfect, and this should be considered when suing the platform owners. And yes, this could help the assholes, but I think you can report such behavior to the FBI or someone.

        • hypelightfly · 13 points · 1 year ago

          Change mail (private) to moderated public notice board (not private). The owner of the public notice board should probably be sued for allowing the content to stay up.

        • @Uncle_Bagel@midwest.social · 12 points · 1 year ago

          If my buddies and I spend a month plotting a crime in my cousin’s spare room, my cousin would be complicit, since he knowingly allowed us to use his property for a criminal conspiracy. The USPS doesn’t know what I am sending in the mail, since they are a common carrier.

        • @Esqplorer@lemmy.zip · 3 points · 1 year ago

          Is the postal service intentionally delivering more mail urging attacks to the people most interested in carrying them out? If the postal service were doing that to increase total mail volume, then yes, we should sue them.

    • sour · 12 points · 1 year ago

      The ISPs don’t encourage people to see content that makes them mad.