When the AI bubble pops, what will remain? Cheap GPUs at fire-sale prices, skilled applied statisticians looking for work, and open-source models that already do impressive things but will grow far more impressive after being optimized:

    • boonhet@sopuli.xyz · 4 days ago

      Also, the AI GPUs probably won’t be great for gaming. And “cheap” could mean anything when they go for $20k apiece.

  • Alphane Moon@lemmy.world · 5 days ago

    I can strongly recommend the article from the OP blog post about market dynamics and the use of what is essentially accounting fraud by the major companies involved in AI:

    Lifespan of AI Chips: The $300 Billion Question

    I am looking forward to reading the research paper they are working on.

    While the author takes a relatively neutral tone, the analysis is brutal in its portrayal of the major market players (Nvidia, Microsoft, Amazon, Google); they come off more as oligopolists who are happy to engage in what is, de facto, an attempt to undermine true market competition.

    • humanspiral@lemmy.ca · 4 days ago

      OP’s post is largely right, but it doesn’t require that link to be true. Also, whether these $3m+ systems are warrantied is a relevant question. It’s hard to know the exact lifespan from one person saying their GPU failed quickly. The paper still stands well.

      Because of power constraints, I’d expect them to replace GPUs every 2 years with new generations, and so there will be big write-offs.
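
      A rough sketch of why that matters (all numbers hypothetical, not from the linked article): under straight-line depreciation, a card carried on a 6-year schedule but pulled after 2 years leaves most of its purchase price to be written off at once.

      ```python
      # Hypothetical figures: straight-line depreciation of a $30,000 accelerator
      # on a 6-year schedule, retired after 2 years of actual service.
      PURCHASE_PRICE = 30_000     # USD, assumed per-GPU cost
      SCHEDULE_YEARS = 6          # depreciation schedule used in the books
      ACTUAL_LIFE_YEARS = 2       # replacement cadence driven by power constraints

      annual_depreciation = PURCHASE_PRICE / SCHEDULE_YEARS
      book_value_at_retirement = PURCHASE_PRICE - annual_depreciation * ACTUAL_LIFE_YEARS

      print(f"Depreciated over 2 years: ${annual_depreciation * ACTUAL_LIFE_YEARS:,.0f}")
      print(f"Write-off at retirement:  ${book_value_at_retirement:,.0f}")
      # -> two thirds of the purchase price is still on the books when the card is pulled
      ```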

      • Alphane Moon@lemmy.world · 4 days ago

        A 6-year depreciation schedule seems unrealistically long for a GPU.

        Even in gaming terms (I know this is a completely different use case), a 2080S from 2019, a high-end SKU, would struggle with many modern games at 1440p and higher. A professional streamer would be unlikely to use a 2080S.

        Then there is the question of incentives. An objective look at American technology and VC suggests they are far closer to criminal organizations than their treatment by media and US institutions would imply. They very much can be expected to engage in what is essentially accounting fraud.

        • humanspiral@lemmy.ca · 4 days ago

          a 2080S from 2019, a high-end SKU, would struggle with many modern games at 1440p and higher. A professional streamer would be unlikely to use a 2080S.

          On one hand, a 2080S would still be good at doing what it was doing 6 years ago. If there are new needs and unlimited power availability, then buying a new card while keeping the 6-year-old GPU running whatever AI workload it can still handle makes sense… if that card still works. Selling your 2080S, or whatever old card, does mean a fairly steep loss compared to the original price, but a 6-year depreciation schedule is OK… IF the cards are still working 6 years later.

          The $3m NVL72 systems are a bit different: one out of 72 cards burning out can screw up the whole system, and given the datacenter power infrastructure and expertise requirements, they would have low resale value, though I assume the cards can be ripped out and sold individually.

          They very much can be expected to engage in what is essentially accounting fraud.

          Oracle this week “proudly boasted” that they get 30% margins on their datacenter business, and the stock went up. That is not enough, as it is just 30% over electricity costs. Maintenance/supervision and GPU costs/rentals don’t count, so it is unlikely that they are profitable, though it’s not so much accounting fraud as it is accounting PR.
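
          A back-of-the-envelope sketch of why a margin quoted over electricity alone says little about overall profitability (only the “30% over electricity” framing comes from the boast; every dollar figure below is assumed):

          ```python
          # Hypothetical monthly figures for one rack; only the "30% margin over
          # electricity" framing is from the quote, all amounts are assumptions.
          electricity_cost = 10_000                 # USD/month, assumed
          revenue = electricity_cost * 1.30         # "30% margin over electricity"

          gpu_depreciation = 25_000                 # USD/month, assumed hardware cost share
          maintenance_and_staff = 3_000             # USD/month, assumed

          margin_vs_electricity_only = revenue - electricity_cost
          result_all_in = revenue - (electricity_cost + gpu_depreciation + maintenance_and_staff)

          print(f"Margin counting only electricity: ${margin_vs_electricity_only:,.0f}/month")
          print(f"All-in result:                    ${result_all_in:,.0f}/month")
          # -> the headline margin can coexist with a large all-in loss
          ```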

  • Zkuld@lemmy.world · 5 days ago

    Hmm and what about the everyday user who needs to ask AI how long to cook potatoes? What will they do after the pop?

    • rem26_art@fedia.io · 5 days ago

      completely uncharted territory. No one tried to cook a potato until Sam Altman graced us plebs with ChatGPT

    • cubism_pitta@lemmy.world · 5 days ago

      Local models are actually pretty great! They are not great at everything… but for what most people are using LLMs for they do a fine job.

      That’s from llama3.1:8b, and the answer is decent; it took about 20 seconds to generate and used no more power than if I were to play a video game for the same amount of time.
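
      For anyone who wants to try the same thing, here is a minimal sketch of querying a local llama3.1:8b through Ollama’s HTTP API (this assumes Ollama is the runtime, running on its default port with the model already pulled; the prompt is just an example):

      ```python
      # Minimal sketch: ask a locally served llama3.1:8b about cooking potatoes.
      # Assumes the Ollama daemon is running locally (default port 11434) and the
      # model has been pulled, e.g. `ollama pull llama3.1:8b`.
      import json
      import urllib.request

      payload = {
          "model": "llama3.1:8b",
          "prompt": "How long should I boil medium-sized potatoes for mashing?",
          "stream": False,  # return one JSON object instead of a token stream
      }

      req = urllib.request.Request(
          "http://localhost:11434/api/generate",
          data=json.dumps(payload).encode(),
          headers={"Content-Type": "application/json"},
      )

      with urllib.request.urlopen(req) as resp:
          print(json.loads(resp.read())["response"])
      ```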

      • br3d@lemmy.world · 5 days ago

        Cue 8 paragraphs of “I first learned to cook potatoes with my grandfather back in…” and every reader screaming “Oh my god just get to the recipe…”

  • Glitchvid@lemmy.world · edited · 5 days ago

    The sad thing is those GPUs are on specialized boards in specialized servers with specialized cooling, and not really good at the kind of work consumers would want them for, assuming you could even get drivers. So almost all of those GPUs will realistically get scrapped if the bubble pops.

    Maybe we’ll see the EPYC CPUs get sold secondhand at least, those are socketed.