• grue@lemmy.world · 60 points · 9 days ago

    When mainstream media starts asking if something is a bubble, it’s not only already been one for quite a while, but it’s about to pop.

      • LiveLM@lemmy.zip · 7 points · 9 days ago (edited)

        Was there a lot of hype surrounding the new launch? I didn’t really keep up with it.

        Regardless, I think it’ll take a bigger disappointment to burst it. Maybe something on the corporate side, like big players not seeing a return on their investment.

        • brucethemoose@lemmy.world · 6 points · 9 days ago (edited)

          Ohhh yes. Altman’s promotion for it was the Death Star coming up from behind a planet.

          > Maybe something on the corporate side, like big players not seeing a return on their investment.

          Ohhh, it is. The big corporate hosters aren’t making much money and are burning cash, and it’s not getting any better as specialized open models eat them from the bottom up.

          GPT-OSS was kind of a flop too, even with decensoring.

          AI isn’t gone, but the corporate side is realizing this is about as good a way to make money as selling air.

  • 9point6@lemmy.world · 30 points · 9 days ago

    It’s been a bubble since GPT-2, guys, get with the program.

    There is zero chance even half of all these AI product companies still exist in half a decade.

    Now if you don’t mind, I reckon I’m gonna AltaVista search for CDNow and then Webvan something from Pets.com

  • IrateAnteater@sh.itjust.works · 9 points · 9 days ago

    For some it will be. For the pure AI software companies, yes. For the hardware vendors and data centers, less so. Even if it’s not for generative AI, there will always be a need for hyperscale compute.

      • IrateAnteater@sh.itjust.works · 4 points · 9 days ago

        An entire state government could fit in your cellphone. That’s never been one of the use cases for data center level compute.

        • bacon_pdp@lemmy.world · 3 points · 9 days ago

          OK; what application (which benefits society) requires data center level compute, beyond physics simulations (which are better suited to quantum computers)?

          • Blue_Morpho@lemmy.world · 4 points · 9 days ago

            An entire state government could run on your phone but requires an entire data center because it’s written in JavaScript that emulates the original COBOL code that ran the government in the 1960’s.

            • bacon_pdp@lemmy.world · 1 point · 9 days ago

              No. A state government needs to support 1/10th of its population actively using its services. Say the state has 10M people; you will want 10k cores for all state services. An 8P server has about 1536 cores, so you will need about 7 of them. So it still takes a whole rack, even with the COBOL programs and applications written in C and Assembly.
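The sizing above can be sketched in code. All figures are the commenter’s own assumptions; the 100-users-per-core ratio is implied by the 10M-population and 10k-core numbers, not stated directly:

```python
import math

# Back-of-the-envelope capacity sizing (commenter's assumed figures).
population = 10_000_000
active_users = population // 10          # 1/10th of residents active at once
users_per_core = 100                     # implied by the 10k-core target
cores_needed = active_users // users_per_core        # 10,000 cores
cores_per_8p_server = 1536               # "an 8P server has about 1536 cores"
servers_needed = math.ceil(cores_needed / cores_per_8p_server)

print(cores_needed, servers_needed)      # 10000 7
```

Seven 8P boxes is roughly one rack, which matches the conclusion in the comment.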

              • Blue_Morpho@lemmy.world · 4 points · 9 days ago

                “State services” is database lookups and billing. Back in the ’90s, I supported 10k users (1.5k active at any moment) on a Pentium III with 512 MB of RAM.

                • bacon_pdp@lemmy.world · 0 points · 9 days ago

                  Constraint solvers for things such as Medicaid eligibility; OCR tagging for scanned documents; Anti-AI detection for uploaded images; but yes most state services are data entry and batch processing with web front ends.

                  Also, the number of supported users does not scale linearly with the number of CPU cores, as Amdahl’s law showed back in 1967.
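For reference, Amdahl’s law bounds the speedup from N cores when only a fraction p of the work is parallelizable: S(N) = 1 / ((1 − p) + p/N). A quick sketch, with an assumed (illustrative) parallel fraction:

```python
# Amdahl's law: speedup from N cores when a fraction p of the
# workload is parallelizable (p = 0.95 is an assumed example value).
def amdahl_speedup(p: float, n_cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / n_cores)

# Even with 95% parallel work, 10,000 cores cap out near 20x,
# because the 5% serial portion dominates.
print(round(amdahl_speedup(0.95, 10_000), 1))  # 20.0
```

This is why throwing cores at a workload with any serial component (locks, a shared database, batch steps) stops helping long before the core count runs out.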

      • brucethemoose@lemmy.world · 3 points · 9 days ago (edited)

        Not a lot? The quirk is that they’ve hyper-specialized the nodes around AI.

        The GPU boxes are useful for some other things, but they will be massively oversupplied, and they mostly aren’t networked like supercomputer clusters.

        Scientists will love the cheap CUDA compute though. I am looking forward to a hardware crash.

        • misk@piefed.social · 2 points · 9 days ago (edited)

          That’s what I figured, but I was open to hearing how data centers won’t go bankrupt when the current VC / investor money stops propping up the AI arms race. I’m not even sure lots of existing hardware won’t go to waste, because there’s seemingly not enough power infrastructure to feed it, and big tech corpos are building nuclear reactors (on top of restarting coal power plants…). Those reactors might be another silver lining, however, similar to cheap compute becoming available for scientific applications.

          • brucethemoose@lemmy.world · 2 points · 8 days ago

            > because there’s seemingly not enough power infrastructure

            This is overblown. I mean, even if you estimate TSMC’s entire capacity and assume every data center GPU they make runs at full TDP 100% of the time (which is not true), the net consumption isn’t that high. The local power/cooling infrastructure problems are more about corpo cost cutting.

            Altman’s preaching that power use will grow exponentially is a lie that’s already crumbling.

            But there is absolutely precedent for underused hardware flooding the used market, or getting cheap on cloud providers. Honestly, this would be incredible for the local inference community, as it would give tinkerers (like me) genuinely affordable hardware to experiment with.

    • brucethemoose@lemmy.world · 3 points · 9 days ago (edited)

      I mean, GPU box hardware prices will plummet if there’s a crash, like they did with crypto GPU mining.

      That’s how I got my AMD 7950 for peanuts. And an Nvidia 980 Ti!

      I am salivating over this. I am so in for a fire sale MI300 or A100.

  • etherphon@piefed.world · 8 points · 9 days ago (edited)

    Is it really a boom when it’s literally forced upon you unwillingly? Shoehorned into all the products you used to like?