There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. A subtler addition to Xcode 16, the development environment for Apple platforms like iOS and macOS, is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.

  • @Jtee@lemmy.world · 160 points · 5 months ago

    And now all the fanboys and girls will go out and buy another MacBook. That’s planned obsolescence for ya.

    • @bamboo@lemm.ee · 64 points · 5 months ago

      Someone who is buying a MacBook with the minimum specs probably isn’t the same person that’s going to run out and buy another one to get one specific feature in Xcode. Not trying to defend Apple here, but if you were a developer who would care about this, you probably would have paid for the upgrade when you bought it in the first place (or couldn’t afford it then or now).

      • @TheGrandNagus@lemmy.world · 25 points · 5 months ago

        Well no, not this specific scenario, because of course devs will generally buy machines with more RAM.

        But there are definitely people who will buy an 8GB Apple laptop, run into performance issues, then think “oh I must need to buy a new MacBook”.

        If Apple didn’t purposely manufacture ewaste-tier 8GB laptops, that would be minimised.

        • @narc0tic_bird@lemm.ee · 9 points · 5 months ago (edited)

          I wouldn’t be so sure. I feel like many people would not buy another MacBook if it were to feel a lot slower after just a few years.

          This feels like short term gains vs. long term reputation.

    • m-p{3} · 31 points · 5 months ago

      And it’s why they solder the RAM, or even worse, make it part of the SoC.

      • @rockSlayer@lemmy.world · 51 points · 5 months ago

        There are real-world performance benefits to RAM being as close as possible to the CPU, so it’s not entirely without merit. But that’s what CAMM modules are for.

        • @akilou@sh.itjust.works · 25 points · 5 months ago

          But do those benefits outweigh being able to double or triple the amount of RAM by simply inserting another stick that you can buy for a few dozen dollars?

          • @rockSlayer@lemmy.world · 19 points · 5 months ago

            That’s extremely dependent on the use case, but in my opinion, generally no. However, CAMM has been released as an official JEDEC interface and serves as a good middle ground between repairability and speed.

            • @halcyoncmdr@lemmy.world · 20 points · 5 months ago

              It’s an officially recognized spec, so Apple will ignore it as long as they can, at least until they find a way to make money from it or spin the marketing as if it’s some miraculous new invention of theirs, for something that should just be how it’s done.

              • umami_wasabi · 9 points · 5 months ago

                Parts pairing will do. That’s what Apple is known for: kneecapping consumer rights.

          • @BorgDrone · 9 points · 5 months ago

            Yes, there are massive advantages. It’s basically what makes unified memory possible on modern Macs. Especially with all the interest in AI nowadays, you really don’t want a machine with a discrete GPU/VRAM, a discrete NPU, etc.

            Take, for example, a modern high-end PC with an RTX 4090. That card only has 24GB of VRAM, and the CPU can only reach it through the (relatively slow) PCIe bus. AI models can get really big, and 24GB can be too little for the bigger ones. You can spec an M2 Ultra with 192GB of RAM, and almost all of it is accessible by the GPU directly. Even better, the GPU can access it without any need to copy data back and forth over a PCIe bus, so there’s literally zero overhead.

            The advantages of this multiply when you have more dedicated silicon. For example: if you have an NPU, that can use the same memory pool and access the same shared data as the CPU and GPU with no overhead. The M series also have dedicated video encoder/decoder hardware, which again can access the unified memory with zero overhead.

            For example: you could have an application that replaces the background of a video using AI. It takes a video and decompresses it with the video decoder, and the decompressed frames are immediately available to all other components. The GPU can then pre-process the frames, the NPU can use the processed frames as input to some AI model and generate new frames, and the video encoder can immediately access that result and compress it into a new video file.

            The overhead of just copying data for such an operation on a system with non-unified memory would be huge. That’s why I think that the AI revolution is going to be one of the driving factors in killing systems with non-unified memory architectures, at least for end-user devices.
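
            To make the zero-copy point concrete, here’s a minimal Metal sketch in Swift. It’s only a sketch assuming an Apple-silicon Mac with a Metal device; the buffer size and variable names are illustrative, not anything from the comment above.

              import Metal

              // Minimal sketch (assumption: an Apple-silicon Mac with a Metal device).
              // With unified memory, a .storageModeShared buffer is a single allocation
              // that both the CPU and the GPU read and write directly; there is no
              // staging copy across a PCIe bus as on a discrete-GPU system.
              guard let device = MTLCreateSystemDefaultDevice() else {
                  fatalError("No Metal device available")
              }

              let count = 1_000_000
              let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                                             options: .storageModeShared)!

              // The CPU writes straight into the same memory the GPU will use.
              let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
              for i in 0..<count { values[i] = Float(i) }

              // A compute pass would bind this very buffer with no upload step, e.g.
              //   encoder.setBuffer(buffer, offset: 0, index: 0)
              // and whatever the kernel writes is immediately visible to the CPU via `values`.

            On a discrete GPU the same data would first have to be copied into VRAM over PCIe (and copied back for the CPU to read the results), which is exactly the overhead described above.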

          • FarraigePlaisteach · 1 point · 5 months ago

            And even if the out-of-the-box RAM is soldered to the machine, it should still be possible to add supplementary RAM that isn’t soldered for when the system demands it. Other computers have worked like this in the past, with RAM on the board plus a socket to add more.

          • @gravitas_deficiency@sh.itjust.works · 1 point · 5 months ago

            It’s highly dependent on the application.

            For instance, I could absolutely see having certain models with LPCAMM expandability as a great move for Apple, particularly in the pro segment, so they’re not capped by whatever they can cram into their monolithic SoCs. But for most consumer (that is, non-engineer/non-developer users) applications, I don’t see them making it expandable.

            Or more succinctly: they should absolutely put LPCAMM in the next generation of MBPs, in my opinion.

        • @TheGrandNagus@lemmy.world · 7 points · 5 months ago

          Apple’s SoC long predates CAMM.

          Dell first showed off CAMM in 2022, and it only became JEDEC standardised in December 2023.

          That said, if Dell can create a really good memory standard and get JEDEC to make it an industry standard, so can Apple. They just chose not to.

        • umami_wasabi · 10 points · 5 months ago

          Well, the claim they made still holds true, despite how much I dislike this design choice. It is faster, and more secure (though attacks on NAND chips are hard and require a skill level most attackers won’t possess).

          And add one more: it saves power by using LPDDR5 rather than DDR5. For a laptop, where battery life matters a lot, I agree that’s important. However, I have no idea how much standby or active time it gains by using LPDDR5.

      • Balder · 9 points · 5 months ago (edited)

        In this particular case, the RAM is part of the chip package as an attempt to squeeze out more performance. Processors have become very fast, but that speed is wasted if the rest of the components can’t keep up. The traditional memory architecture has become a bottleneck, the same way HDDs were before the introduction of SSDs.

        You’ll see this same trend extend to Windows laptops as they shift to Snapdragon processors too.

    • @Mongostein@lemmy.ca · 12 points · 5 months ago

      And the Apple haters will keep making this exact same comment on every post, using their third laptop in ten years, while I’m still using my 2014 MacBook daily with no issues.

      Be more original.

      • @Jtee@lemmy.world · 6 points · 5 months ago

        Nice attempt to justify planned obsolescence. To think Apple hasn’t done this time and time again, you’d have to be a fool.

          • @Honytawk@lemmy.zip · 2 points · 5 months ago

            At what point did Apple decide your MacBook was too old to be usable and stop giving it updates or allowing new software to run on it?

            • @Mongostein@lemmy.ca · 3 points · 5 months ago (edited)

              Still gets security updates. All the software I need to run on it runs on it.

              My email, desktop, and calendar all still sync with my newer desktop. I can still play StarCraft. I can join zoom meetings while running Roll 20. I can even run Premiere and do video editing… to a point.

              I guess if you need the latest and greatest then you might have a point, but I don’t.

              This whole thread is bitching about software bloat, and Apple does this to stop software bloat on older machines, but noooo, that’s planned obsolescence. 🙄

      • @stoly@lemmy.world · 5 points · 5 months ago

        This is pretty much it. People have really just wanted to find reasons to hate Apple for the past 2-3 years. You’re right, though: your Mac can easily run for 10+ years. You’re basically good until web browsers no longer support your OS version, which is more in the 12-15 year range.

        • @theneverfox@pawb.social · 2 points · 5 months ago

          In fairness, most computers built after around 2014-2016 last way longer; performance started to level off not long after that. After all, devs write software for what people have: if everyone had 128 gigs of RAM, we’d load everything we could think of into memory and you’d need it to keep up.

          Macs did have some incredible build quality, though; the newer ones aren’t holding up nearly as well. I’m still using a couple of 2012 Macs to play videos. It’s slow as hell when you interact, but once the video is playing it still looks and sounds good.

      • @Honytawk@lemmy.zip · 1 point · 5 months ago (edited)

        I still have a fully functioning Windows 95 machine.

        My daily driver desktop is also from around 2014.

    • @Lucidlethargy@sh.itjust.works · 4 points · 5 months ago

      These were obsolete the minute they were made, though… So it’s not really planned obsolescence. I got one for free (MacBook Air), and it’s always been trash.

      • @bamboo@lemm.ee · 3 points · 5 months ago

        I have an M2 MBA and it’s the best laptop I’ve ever owned or used, second only to the M3 Max MBP I get to use for work. It’s silent, the battery lasts all week, the interface is fast, and it runs all my dev tools like a charm. Zero issues with the device.