• chrash0@lemmy.world · 6 months ago

    people probably hate the iOS integration just because it’s another AI product, but they’re fundamentally different. the problem with Recall isn’t the AI, it’s the trove of extra data it collects that you normally wouldn’t save to disk, whereas the iOS features only access existing data that you’ve given them access to.

    from my perspective this is a pretty good use case for “AI” and about as good as you can do privacy-wise, if their claims pan out. most features use existing, user-controlled data and local models, and it’s pretty explicit about when it’s reaching out to the cloud.

    this data is already accessible by services on your phone or exists in iCloud. if you don’t trust that infrastructure already then of course you don’t want this feature. you know how you can search for pictures of people in Photos? that’s the terrifying cLoUD Ai looking through your pictures and classifying them. this feature actually moves a lot of that semantic search on device, which is inherently more private.
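    the on-device semantic search described above can be sketched as embedding photos and queries into a shared vector space and ranking by cosine similarity, entirely locally. this is an illustrative sketch, not Apple’s implementation — the toy 3-dimensional “embeddings” stand in for whatever local model actually produces them:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def semantic_search(query_vec, photo_index, top_k=3):
    """Rank locally stored photo embeddings against a query embedding.

    photo_index maps photo path -> embedding vector. Both the index and
    the query live on the device; nothing here requires a network call.
    """
    scored = [(cosine_similarity(query_vec, vec), path)
              for path, vec in photo_index.items()]
    scored.sort(reverse=True)
    return [path for _, path in scored[:top_k]]

# Toy 3-dimensional "embeddings" standing in for real model output.
index = {
    "beach.jpg":   [0.9, 0.1, 0.0],
    "dog.jpg":     [0.1, 0.9, 0.1],
    "receipt.png": [0.0, 0.1, 0.9],
}
print(semantic_search([0.8, 0.2, 0.0], index, top_k=1))  # → ['beach.jpg']
```

    the index is just a local file of vectors, which is why moving classification on-device is the more private design.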

    of course it does make access to that data easier, so if someone could unlock your device they could potentially get access to sensitive data with simple prompts like “nudes plz”. but you should have layers of security on more sensitive stuff like bank or social accounts, which would keep Siri from reading it. Siri likely won’t be able to access app data unless it’s exposed via their API.

    • MudMan@fedia.io · 6 months ago

      Wait, no, that doesn’t sound right. From the way Apple describes this, they are accessing all your info, plus extracting context from it. So not only does it know people’s faces, who sent you what when, and the content of every image on your device and every message you sent or received, but it knows which people are related to you and how, where you are, and a bunch of other stuff.

      Plus there are other issues on the Apple side where it compares worse in terms of privacy. As far as I can tell this doesn’t have an opt-out, right? And they do send the data to Apple servers for processing (but don’t store it), which the MS version doesn’t do at all. It seems like they each have ways in which they’re worse than the other privacy-wise, although presumably the only actually secure option between the two would be Windows with Recall turned off, unless Apple does have an opt-out they’re not talking about.

      Ultimately, like I’ve been telling everyone, the interesting bit here is how the presentation, branding, and positioning of each brand alter the outcome. Both MS and Apple are arguing the same thing: that your data is secure because their system is secure and your data remains local, or at least under your control. But one of them paid no mind to presenting security as a concern and will only ship some common-sense additional security in response to pushback, while the other ships something very similar and reassures you in a calm voice that this is all very private, even as it’s flying through the ether to an Apple server. So one is “a security and privacy nightmare” and the other one… well, if you have your nudes just sitting in your personal device you’re really just asking for it, you know?

      That is the kind of understanding of marketing that separates Apple from MS, if you ask me. A whole master class in branding right there. I’ll go one further: based on what I’m reading about this, I suspect that if MS had announced their bad, unencrypted, leaky version today, after the Apple presentation, they would have seen less angry pushback, because Apple’s good messaging would have smoothed things over for both.

      Human brains are squishy and weird.

      • mark3748@sh.itjust.works · 6 months ago

        Apple’s AI is mostly processed on device. That’s why it takes an iPhone 15 Pro or an M-series processor. They also claim that what is processed in the cloud is neither identifiable nor stored, just processed. We will know if that’s true (at least in terms of what is being sent) as soon as it gets out into the public and we can start picking apart the traffic.

        There’s no mention yet of whether there’s an opt-out, probably because we’re several months away from the actual release. I’m sure we’ll get more information before then.

        • MudMan@fedia.io · edited · 6 months ago

          MS’s AI is entirely processed on device. That was their entire security pitch: the data never leaves your PC, why are you all getting so angry about it? Isn’t your PC secure?

          But you didn’t remember that, because you were already angry when you read the headlines, and that was only two paragraphs down, and it’s also a terrible argument that doesn’t resolve any of the valid concerns people had.

          But Apple went out there and talked about sending the name and face of your auntie to their servers, along with every email she’s ever sent you, for a computer to parse exactly how close you are to her, like it’s the best thing that’s happened to your privacy this century. And they sounded like they meant it, and they were vague enough, and they pinky promised not to keep any of that info for themselves. And you don’t just remember it, you believe it.

          They are really, REALLY good at this, and that’s only helped by how bad Microsoft is at it.

      • chrash0@lemmy.world · 6 months ago

        a lot of things are unknown.

        i’d be very surprised if it doesn’t have an opt-out.

        a point i was trying to make is that a lot of this info already exists on their servers, and your trust in the privacy of that is what it is. if you don’t trust their claims that it runs on per-user virtualized compute, that it’s e2e encrypted, or that they’re using local models, i don’t know what to tell you. the model isn’t hoovering up your messages and sending them back to Apple unencrypted. it doesn’t need to for these features.

        all that said, this is just what they’ve told us, and there aren’t many people who know exactly what the implementation details are.

        the privacy issue with Recall, as i said, is that it collects a ton of data passively, without explicit consent. if i open my KeePass database on a Recall-enabled machine, i have little assurance that this bot doesn’t know my Gmail password. Apple’s feature uses existing data, in controlled systems. that’s the difference. sure, maybe people see Apple as more trustworthy, but maybe sociology has something to do with your reaction to it as well.
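        that difference can be sketched as a consent check in front of every data source, versus passively capturing whatever is on screen. this is a made-up illustration — `GRANTED_SOURCES` and `read_source` are hypothetical names, not any real API:

```python
# Hypothetical assistant that can only read sources the user explicitly granted.
GRANTED_SOURCES = {"photos", "calendar"}   # user opted these in; "keepass" is absent

DATA = {
    "photos": ["beach.jpg", "dog.jpg"],
    "calendar": ["dentist 3pm"],
    "keepass": ["gmail: hunter2"],         # exists on disk, but never granted
}

def read_source(source):
    """Refuse to read anything the user hasn't opted in."""
    if source not in GRANTED_SOURCES:
        raise PermissionError(f"{source} was never granted to the assistant")
    return DATA[source]

print(read_source("photos"))   # → ['beach.jpg', 'dog.jpg']
# read_source("keepass") raises PermissionError — whereas a Recall-style
# screen scraper would have captured the open database window regardless.
```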

        • MudMan@fedia.io · 6 months ago

          No, but hear me out, that’s the thing. That’s why they’re so good.

          I didn’t even consider an opt-out throughout that entire presentation.

          They clearly don’t have one, in retrospect. If you go watch it they aren’t even entertaining that option. This is how stuff works now. If you squint, this is how it’s always been, it just hadn’t been deployed yet. It’s magnificently ruthless. They don’t even frame it as a feature that you have that lives in its own space and you could potentially turn on and off.

          This is just how Apple devices work now. It’s just what they do. What do you mean opt out? Could you opt out of the Retina display or the Dynamic Island? It’s just a fact of life.

          They are so good. I can’t believe we all spent a week arguing about an opt-out that MS had confirmed day one, while Apple was able to entirely bypass the issue in like half an hour, and even I, thinking and posting about it, took an extra day to notice that they aren’t even surfacing the possibility that you might not want this. That’s some next-level marketing wizardry right there.