• GreatAlbatross@feddit.uk · 72 points · 3 days ago

    Or from the sounds of it, doing things more efficiently.
    Fewer cycles required, less hardware required.

Maybe this was an inevitability: if you cut off access to the fast hardware, you create a natural advantage for more efficient systems.

    • sugar_in_your_tea@sh.itjust.works · 41 points · 3 days ago

      That’s generally how tech goes though. You throw hardware at the problem until it works, and then you optimize it to run on laptops and eventually phones. Usually hardware improvements and software optimizations meet somewhere in the middle.

      Look at photo and video editing: you used to need a workstation for that, and now you can do most of it on your phone. Surely AI is destined to follow the same path, with local models getting more and more robust until eventually the beefy cloud services are no longer required.

      • jmcs@discuss.tchncs.de · 46 points · 3 days ago

        The problem for American tech companies is that they didn’t even try to move to stage 2.

        OpenAI is hemorrhaging money even on their most expensive subscription, and their entire business plan was to hemorrhage money even faster, to the point of using entire power stations to run their data centers. Their plan makes about as much sense as digging yourself out of a hole by trying to dig through to the other side of the globe.

        • sugar_in_your_tea@sh.itjust.works · 17 points · 3 days ago

          Hey, my friends and I would’ve made it to China if recess was a bit longer.

          Seriously though, the goal for something like OpenAI shouldn’t be to sell products to end customers, but to license models to companies that sell “solutions.” I see these direct-to-consumer offerings the way GPU manufacturers see reference cards, or the way Valve sees the Steam Deck: a proof of concept for others to follow.

          OpenAI should be looking to be more like ARM and less like Apple. If they do that, they might just grow into their valuation.