Google promised to keep data from the phone calls private.

  • efstajas@lemmy.world · 6 months ago

    God, everyone, read the article, please. The feature in question uses an on-device AI model, meaning none of the audio or transcript leaves the phone. If Google wanted to secretly record and steal your phone transcripts, they could do so already. They wouldn’t need this feature.

    • unexposedhazard@discuss.tchncs.de · 6 months ago

      I’m sorry, but are you high or something? You clearly understand enough about computers to know what those words mean, but you didn’t even consider that Google phones can do whatever they want with data that’s “on-device”. Every device with Google services has a root backdoor. Of course they will gather all that data, because why wouldn’t they? They can also gather it on demand, because it will surely get logged on-device and can be extracted at any time. The rules could also be changed at any time, without warning, to allow them to collect the data or start using it in whatever ways they want.

      Also, even if they wouldn’t collect the data, it’s a fucking horrible idea in the first place. Sure, let’s outsource trust in our communications to some shitty machine-learning algorithm that is dumber than a fucking toddler.

      • efstajas@lemmy.world · 6 months ago

        Before I get deeper into this argument, the main point I was trying to make is that people are clearly assuming based on the headline that the transcript analysis happens in the cloud, and aren’t aware that Google at least claims it’s fully on-device. If Google wants to steal phone transcripts, they can do that already; this feature doesn’t change anything about it.

        “Also, even if they wouldn’t collect the data, it’s a fucking horrible idea in the first place. Sure, let’s outsource trust to an algorithm that is dumber than a fucking toddler.”

        The privacy discussion aside, the feature is designed to step in and warn the user when it detects a likely scam in progress. I don’t see how this is inherently a bad idea at all. My grandma got scammed hard on the phone a few years back; this likely would’ve prevented it. And “outsourcing trust to an algorithm” is clearly not what’s happening here. People get scammed all the time; clearly more needs to be done to stop scams.

        Other than this… I know that people, especially here, are super wary of Google and their privacy-related claims for very good reason. I am too. I know this is a very sensitive topic. But realistically, for this particular discussion…

        “Of course they will gather all that data, because why wouldn’t they?”

        There are so, SO many reasons why a massive company like Google, especially one that is constantly under scrutiny for its privacy practices, wouldn’t secretly record / analyze / store / whatever private phone conversations, and tbh they most probably just aren’t. There is immense regulation around this topic in practically every market they operate in. If Google were found straight-up sending transcripts of phone conversations to their servers without very explicit consent (i.e. more than some clause buried in a ToS somewhere), it’d realistically be the biggest scandal in Google’s history, and would likely significantly hurt, if not kill, at least their phone division. In many markets, just recording phone conversations is already illegal without consent from both sides, and Google can’t just do it anyway based on some ToS legalese; it’s simply illegal.

        To be clear, I don’t believe they do this: not because they’re good people or anything, but because from a pure business standpoint it’d be immensely risky, for data that is also hardly usable in practice due to how sensitive it is. The circle of people that would even be allowed to know of its existence internally would have to be tiny and extremely trusted to prevent leaks.

        The truth is that they can already amass so much data through other potentially dubious yet totally legal means, so an immense and illegal overstep of privacy convention like this is just unnecessary.

        • bobotron@lemm.ee · 6 months ago

          This is a really well-written response; kudos for making a rational argument on the Internet.

      • EnderMB@lemmy.world · 6 months ago

        While I do agree with the premise of your comment, most countries (including the US) have strict and long-standing laws on recording phone conversations. Even if Google wanted to do this, I can see it being an absolute nightmare to egress that data from the device onto external storage.

        • unexposedhazard@discuss.tchncs.de · 6 months ago

          Unless it doesn’t count as “recording”, because the information is transformed by the model in a way that keeps the law from applying to it. But yeah, it will be interesting to see how this might go in court.

      • solo@kbin.earth (OP) · 6 months ago

        “Of course they will gather all that data, because why wouldn’t they?”

        But they promised…

      • dev_null@lemmy.ml · 6 months ago

        “Sure, let’s outsource trust in our communications to some shitty machine-learning algorithm that is dumber than a fucking toddler.”

        And it really doesn’t need to be smarter than that to show a “banks will never ask you to transfer money to another account; this is likely a scam” dialog when the speaker claiming to be from your bank tells you to do so. This will save lots of vulnerable and older people from getting scammed. If you think that’s a horrible idea, then I’m interested in your reasoning. Over $10 billion per year is lost to scams; making a dent in that is amazing.
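        To make that concrete, here is a toy sketch of how little smarts a useful warning needs. The phrase list and function are made up purely for illustration; they have nothing to do with Google’s actual implementation, which uses a trained on-device model rather than keyword matching:

```python
from typing import Optional

# Hypothetical phrase list for illustration only. A real detector would use a
# trained classifier on the call transcript, not simple substring matching.
SCAM_PHRASES = [
    "transfer your money to a safe account",
    "read me the code we just sent",
    "buy gift cards and send me the numbers",
]

def scam_warning(transcript: str) -> Optional[str]:
    """Return a warning message if the transcript matches a known scam pattern."""
    text = transcript.lower()
    if any(phrase in text for phrase in SCAM_PHRASES):
        return "Likely scam: your bank will never ask you to transfer money or read out codes."
    return None

print(scam_warning("This is your bank. Transfer your money to a safe account now."))
```

        Even something this dumb catches the classic “safe account” script, which is exactly the point: the bar for being helpful here is low.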

        • unexposedhazard@discuss.tchncs.de · 6 months ago

          If you can’t understand why trusting Google with anything (let alone a complete black box inside your phone) is a bad idea, there is no point in having a discussion with you.

          • dev_null@lemmy.ml · 6 months ago

            I wanted to understand, but it looks like you have no arguments and prefer to just concede your point. Fair enough.

              • dev_null@lemmy.ml · 6 months ago

                Why are you changing the topic? Let me simplify this.

                You wrote:

                “Sure, let’s outsource trust in our communications to some shitty machine-learning algorithm that is dumber than a fucking toddler.”

                I disagree that this is bad, and my point is: “A dumb machine-learning algorithm is enough to be helpful for the purpose of scam warnings.”

                Of course you shouldn’t trust Google not to steal your data with their implementation, but you already established that in your original comment, and I didn’t bring it up at all because I agree with you there. What I disagree with is the claim that this is a bad idea. I think trying to protect people from scams is a great use of AI, and I would love an open-source implementation of it that could be included in GrapheneOS, for example.

      • KuroiKaze@lemmy.world · 6 months ago

        Actually, from what I’ve seen, Google has moved hard toward reducing gathered data and encouraging Play Store apps to use fewer permissions and less data.