Who would’ve thought? This isn’t going to fly with the EU.

Article 5.3 of the Digital Markets Act (DMA): “The gatekeeper shall not prevent business users from offering the same products or services to end users through third-party online intermediation services or through their own direct online sales channel at prices or conditions that are different from those offered through the online intermediation services of the gatekeeper.”

Friendly reminder that you can sideload apps without jailbreaking or paying for a dev account using TrollStore, which exploits CoreTrust bugs to bypass/spoof some app signature validation. It works on an iPhone XR or newer on iOS 14.0 up to 16.6.1 (ANY version for iPhone X and older).

Install guide: TrollStore

  • OsrsNeedsF2P@lemmy.ml · 10 months ago

    Closed source software can’t be audited, so it can’t be secure. And if software isn’t secure, exploits strip it of any privacy.

    See: The bimonthly remote takeover bugs that keep getting found. Like this one: https://citizenlab.ca/2023/09/blastpass-nso-group-iphone-zero-click-zero-day-exploit-captured-in-the-wild/

    “Oh whoopsy doopsy, looks like your iPhone, camera, files, GPS and more were accessible to someone who sent you an iMessage… for the third time this year”

    • BorgDrone · 10 months ago

      Closed source software can’t be audited, so it can’t be secure

      That’s the biggest load of bullshit I’ve ever heard.

      Closed source software is audited all the time.

      • OsrsNeedsF2P@lemmy.ml · 10 months ago

        Ok, let me rephrase: nobody without a conflict of interest can audit a closed source application. If Microsoft paid for an audit of Windows, that doesn’t tell you anything about whether or not Windows is backdoored.

        • BorgDrone · 10 months ago

          The audit is not for you. Closed source software is audited all the time, but the results of those audits are generally confidential. This is about finding security bugs, not deliberate backdoors.

          The key here is who you trust. Sure, open source can be audited by everyone, but is it? You can’t audit all the code you use yourself; even if you have the skills, it’s simply too much. So you still need to trust another person or company, and that really doesn’t change the equation much.

          • OsrsNeedsF2P@lemmy.ml · 10 months ago

            In practice, most common open source software is used and contributed to by hundreds of people, so it naturally does get audited by that process. Closed source software can’t be confirmed to not be malicious, so it can’t be confirmed to be secure; so, back to my original point, it can’t be private.

            I didn’t go into that much detail in my original comment, but it was what I meant when I first wrote it. As far as “does everyone audit the software they use”, the answer is obviously no. But, the software I use is mostly FOSS and contributed to by dozens of users, sometimes including myself. So when alarms are rung over the smallest things, you have a better idea of the attack vectors and privacy implications.

            • BorgDrone · 10 months ago

              In practice, most common open source software is used and contributed to by hundreds of people. So it naturally does get audited by that process.

              Just working on software is not the same as actively looking for exploits. Software security auditing requires a specialised set of skills. Open source also makes it easier for black-hat hackers to find exploits.

              Hundreds of people working on something is a double-edged sword: it also makes it easy for someone to sneak in an exploit. A single-character mistake in code can cause an exploitable bug, and if you deliberately introduce such an issue, it can be very hard to spot; even if caught, it can be explained away as an honest-to-god mistake.

              By contrast, lots of software companies screen their employees, especially if they are working on critical code.

              • OsrsNeedsF2P@lemmy.ml · 10 months ago

                I don’t know if you really believe what you’re saying, but I’ll continue answering anyways. I worked at Manulife, the largest private insurance company in Canada. Setting aside that our security team was mostly focused on pen testing (which, as you know, in contrast to audits tells you nothing about whether a system is secure), the audits were infrequent and limited in scope. Most corporations don’t even do audits (and hire the cheapest engineers for the job), and as a consumer, there’s no easy way to tell which audits covered the security aspects you care about.

                If you want to talk more about the security of open source, besides what is already mentioned above: not only are Google, Canonical and Red Hat growing their open source security teams (combined, employing close to 1,000 people whose job is to audit and patch popular open source apps), but open source projects can likewise pay for audits themselves (see Mullvad or Monero as examples).

                I will concede that it is possible for proprietary software to be secure. But in practice, it simply isn’t, and it’s too hard to tell. It’s certainly not secure when compared to similar open source offerings.