Mark Zuckerberg says sorry to families of children who committed suicide — after rejecting suggestion to set up a compensation fund to help the families get counseling::CEOs of Meta, TikTok, Snap, Discord, and X testified at hearing on child safety.

  • angelsomething · 49 points · 10 months ago (edited)

    So if they look like lizard people, and speak like lizard people, and when they blink their eyelids move horizontally, doesn’t that make them lizard people? Bunch of cunts, the lot of them. Especially Zuck. Poison of this world and they know it. And by the way, by lizard people I mean literal people that are so distanced from reality that they may well be from another planet.

  • THCDenton@lemmy.world · 37 points · 10 months ago

    Not a fan of the reptilian, but this isn’t fb’s fault. This is on the abusers, the kids that killed themselves and the careless parents.

    • 31337@sh.itjust.works · 56 points · 10 months ago

      Meta could’ve done a lot of things to prevent this. Internal documents show Zuckerberg repeatedly rejected suggestions to improve child safety. Meta lobbies Congress to prevent any regulation. Meta controls the algorithms and knows they promote bad behavior such as dog-piling, but this bad behavior increases “engagement” and revenue, so they refuse to change it. (Meta briefly changed its algorithms for a few months during the 2020 election to decrease the promotion of disinformation and hate speech, because they were under more scrutiny, but then changed them back after the election.)

    • Anyolduser@lemmynsfw.com · 18 points · 10 months ago

      Canada was not available to be blamed.

      It’s down to parenting, or lack thereof. No politician can say “parents of America, quit giving your children unrestricted internet access and being surprised when they see horrible shit” and keep their job.

      Kids don’t need smartphones.

      Sites can be blacklisted on home and school routers.
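
      For what it’s worth, the router-blacklist idea is a few lines of config on many home setups. A minimal sketch, assuming a router (or home server like a Pi-hole box) running dnsmasq — the filenames and domain list here are just illustrative:

      ```
      # /etc/dnsmasq.d/blocklist.conf (hypothetical file name)
      # address=/domain/ip answers that domain AND all its subdomains
      # with the given address; 0.0.0.0 makes them unreachable.
      address=/facebook.com/0.0.0.0
      address=/instagram.com/0.0.0.0
      address=/tiktok.com/0.0.0.0
      ```

      Worth noting that DNS blocking is easily bypassed with a VPN or DNS-over-HTTPS, which is sort of the point of the comment: filtering helps, but it doesn’t replace actually supervising your kids.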

      Kids who catch flak from their peers for not being on social media can put the blame on their strict parents.

      It ain’t rocket surgery, but you need to be willing to spend time with your kids instead of slapping a phone in front of them to keep them quiet.

      I’ve got a kid that’s magnetically attracted to any screen. I get the temptation but I don’t need a study to tell me that unrestricted internet access is fucking horrible for kids.

      • Doorbook@lemmy.world · 12 points · 10 months ago

        This ignores situations where the kids didn’t have social media but abusers posted to it anyway, like sexual assaults and exploitation of children.

        Failing to moderate a platform that offers privacy is something the platform should be held responsible for.

        Imagine you have a stadium full of fans waiting for the match to start, then someone comes in with a big screen playing a sexual abuse video and then leaves the stadium. It is normal to sue the stadium for lack of security along with suing the abuser.

        Issues like bullying are harder, but when the social network doesn’t remove abusive content, it is at fault.

        Facebook removes staff and systematically ignores reports of this kind because acting on them would affect its value.

        Final note: the US government is useless, and they do this for show to look good in front of their voters. The EU has done more against these corporations.

        • Anyolduser@lemmynsfw.com · 4 points · 10 months ago (edited)

          I’m ignoring that situation because we’ve had laws on the books regarding CSAM and ferocious enforcement of them for decades.

      • Meowoem@sh.itjust.works · 7 points · 10 months ago

        There’s a common thing parents do, though: they don’t notice the point at which they lose control completely.

        It’s almost impossible to keep kids off the internet. Prisons can’t stop prisoners getting phones in, so what hope do parents have? How do you stop them using an account accessed from school computers, or a secret second phone bought second-hand — or, even worse, bought for them by a creepy guy online? And if you block the services you know of, it’ll push them onto ones you’ve never heard of: unmoderated and dangerous places.

        And of course there’s the dream of trust, but none of us tell our parents everything, especially when we’ve already gone too far and are embarrassed that we broke the trust.

        If you as a kid are going to miss out on what feels like everything that’s happening with your friends, then you’ll find a way. Or you’ll get bullied at school by groups that form online and around online memes.

        There need to be safe places where kids can access social media. Just saying they can’t until they’re a certain age won’t work, and even if it did, it would set them up for a lot of issues on their first day.

        A lot of it is down to parents teaching internet skills and awareness. It’s also down to the major platforms that target kids as a key audience to ensure there are effective systems in place to prevent and combat situations that might result in a child being harmed.

      • Steve@communick.news · 3 points · 10 months ago (edited)

        Now I’m wondering. Is this a potential opportunity for the Fediverse?

        Creating a walled in, heavily moderated social network for kids and teens.
        Parents could be mods.
        Would need some kind of age verification.
        Maybe parents could set up accounts for themselves and their kids.
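
        Thinking out loud, the “parents set up accounts for their kids” part could look something like this as a data model. Purely hypothetical — none of these names or checks come from any real Fediverse software:

        ```python
        from dataclasses import dataclass
        from typing import Optional


        @dataclass
        class Account:
            username: str
            is_minor: bool = False
            guardian: Optional["Account"] = None  # set only for minors


        class FamilyInstance:
            """Hypothetical instance where a minor's account can only be
            created by, and stays linked to, an adult guardian account."""

            def __init__(self) -> None:
                self.accounts: dict[str, Account] = {}

            def register_guardian(self, username: str) -> Account:
                # Adults register directly (age verification would go here).
                acct = Account(username)
                self.accounts[username] = acct
                return acct

            def register_minor(self, username: str, guardian_name: str) -> Account:
                guardian = self.accounts.get(guardian_name)
                if guardian is None or guardian.is_minor:
                    raise ValueError("a minor's account needs an adult guardian")
                acct = Account(username, is_minor=True, guardian=guardian)
                self.accounts[username] = acct
                return acct
        ```

        The guardian link is what would let a parent act as a mod for their own kid’s account. The hard part — real age verification — is hand-waved here, which is exactly the open question in the comment above.
        
        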

        Just thinking this over as I type. I don’t know.

      • angrymouse@lemmy.world · 2 points · 10 months ago (edited)

        I agree with almost everything but

        It ain’t rocket surgery

        Got me thinking.

        But I also think social networks could ban a lot more

    • stoly@lemmy.world · 3 points · 10 months ago

      So what you’re saying is that victims of bullying are the real problem, not the people being bullies.

  • stoly@lemmy.world · 26 points · 10 months ago

    Why on earth would someone think that this douchenozzle is capable of empathy for humans? He literally stole facebook because he felt entitled to it and had no problems letting governments use it to coordinate genocides THAT HE WAS AWARE OF. No, if there is a hell, this person will be at the top levels of tortured souls and he fully deserves to suffer.

  • AutoTL;DR@lemmings.world · 4 points · 10 months ago

    This is the best summary I could come up with:


    During a Senate Judiciary Committee hearing weighing child safety solutions on social media, Meta CEO Mark Zuckerberg stopped to apologize to families of children who committed suicide or experienced mental health issues after using Facebook and Instagram.

    asked Zuckerberg if he had ever apologized and suggested that the Meta CEO personally set up a compensation fund to help the families get counseling.

    Zuckerberg did not agree to set up any compensation fund, but he turned to address families in the crowded audience, which committee chair Dick Durbin (D-Ill.) described as the “largest” he’d ever seen at a Senate hearing.

    Among these bills is the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act (STOP CSAM).

    When that bill was introduced, it originally promised to make platforms liable for "the intentional, knowing, or reckless hosting or storing of child pornography or making child pornography available to any person.” Since then, Durbin has amended the bill to omit the word “reckless” to prevent platforms from interpreting the law as banning end-to-end encryption, Recorded Future News reported.

    Durbin noted that X became the first social media company to publicly endorse the STOP CSAM Act when X CEO Linda Yaccarino agreed to support the bill during today’s hearing.


    The original article contains 414 words, the summary contains 208 words. Saved 50%. I’m a bot and I’m open source!

  • Lutra@lemmy.world · 3 points · 10 months ago

    headline: “We’re still asking some people what they think should be done about the harm they caused.”

    Must be nice to get asked what you think you might want to do about it.