• Communist@lemmy.ml · 28 points · 2 years ago

    Does this really even matter for combating this? Were pedos really so stupid that they were putting their shit on the cloud?

    I’m all for stopping pedophiles, but this seems like a scary breach of privacy. Is an Apple employee going to look at my young-looking but completely legal nudes?

  • balerion@beehaw.org · 24 points · 2 years ago

    Combating CSAM is great and all, but something tells me this will also be used for far more sinister purposes.

    • mrbruh · 9 points · 2 years ago

      Always has been, always will be, unfortunately. It’s a classic “think of the children” change.

    • mrbruh · 3 points · 2 years ago

      Don’t worry, most social media platforms already do that free of charge 😁

  • evalda@lemmy.ml · 4 points · 2 years ago

    This would scan regardless of whether iCloud is enabled, but only for minors. Correct?

  • Sagar Acharya@sopuli.xyz · 3 points · 2 years ago

    Any detection logic has to take the video in (i.e., the series of images), feed it to the detection code, and return results. Inside a hidden or very large program, that code could just as well be:

        if nudity_detected(frames):
            send_to_apple_server(frames)
            print("We detected nudity and flagged this video")

    The user cannot differentiate this from well-intended code. The right thing to do is not to track at all! No “SMART” logic to “HELP”!
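    For reference, the benign path Apple described was not a nudity classifier but a match of image fingerprints against a database of known-CSAM hashes. A minimal sketch of that kind of flow (names are hypothetical; the fingerprint here is a cryptographic stand-in for Apple’s perceptual “NeuralHash”, and the real design kept the hash list blinded from the client via private set intersection):

        import hashlib

        # Placeholder values; in the real system the client holds only a
        # blinded database and cannot read or verify its contents.
        KNOWN_HASHES = {"deadbeef" * 8, "cafef00d" * 8}

        def image_fingerprint(image_bytes: bytes) -> str:
            # Stand-in for a perceptual hash; a real one is designed to
            # survive resizing and re-encoding, unlike SHA-256.
            return hashlib.sha256(image_bytes).hexdigest()

        def scan_before_upload(image_bytes: bytes) -> bool:
            # True means this image would be flagged to the server.
            return image_fingerprint(image_bytes) in KNOWN_HASHES

        print(scan_before_upload(b"fake image bytes"))  # False: no match

    Even this version runs on the device as an opaque blob, which is exactly the point above: from the outside, nothing distinguishes it from code that flags or uploads whatever it likes.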