When the AI bubble pops, what will remain? Cheap GPUs at fire-sale prices, skilled applied statisticians looking for work, and open-source models that already do impressive things but will grow far more impressive after being optimized:
There were supposed to be cheap GPUs after the crypto bubble burst
Also, the AI GPUs probably won’t be great for gaming. And “cheap” could mean anything when they go for $20k apiece.
Please let the pop take the tech bros with it.
I’m visualising that scene from The IT Crowd with the boss stepping out of the window…
I wish. I suspect the mega-billionaires will be absolutely fine.
I can strongly recommend the article linked from the OP blog post about market dynamics and the use of what is essentially accounting fraud by the major companies involved in AI:
Lifespan of AI Chips: The $300 Billion Question
I am looking forward to reading the research paper they are working on.
While the author takes a relatively neutral tone, the analysis is brutal in its portrayal of the major market players (Nvidia, Microsoft, Amazon, Google); they come off more as oligopolists who are happy to engage in what is de facto accounting fraud in an attempt to undermine true market competition.
OP’s post is largely right, but it doesn’t require that link to be true. Also, whether these $3M+ systems are warrantied is a relevant question. It’s hard to know the exact lifespan from one person saying their GPU failed quickly. The paper still stands well.
Because of power constraints, I’d expect them to replace GPUs every 2 years with new generations, and so there will be big write-offs.
A 6-year depreciation schedule seems unrealistically long for a GPU.
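To put rough numbers on why the schedule matters, here’s a quick straight-line sketch (the $3M figure is the NVL72 price mentioned below; everything else is illustrative):

```python
# Straight-line depreciation: same cash outlay, very different
# reported annual expense depending on the schedule you pick.
# The $3M system price is an assumption taken from the thread.
system_cost = 3_000_000  # USD

for schedule_years in (2, 6):
    annual_expense = system_cost / schedule_years
    print(f"{schedule_years}-year schedule: ${annual_expense:,.0f}/year")
```

Stretching the schedule from 2 to 6 years cuts the reported annual expense by two thirds: same cash out the door, much better-looking earnings.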
Even in gaming terms (I know this is a completely different use case), a 2080S from 2019, a high-end SKU, would struggle with many modern games at 1440p and higher. A professional streamer would be unlikely to use a 2080S.
Then there is the question of incentives. An objective look at American technology and VC suggests they are far closer to criminal organizations than their treatment by the media and US institutions would imply. They very much can be expected to engage in what is essentially accounting fraud.
a 2080S from 2019, a high-end SKU, would struggle with many modern games at 1440p and higher. A professional streamer would be unlikely to use a 2080S.
On one hand, a 2080S would still be good at doing what it was doing 6 years ago. If there are new needs and unlimited power availability, then buying a new card while keeping the 6-year-old GPU on whatever AI workload it can still handle makes sense… if that card still works. Selling your 2080S or whatever old card does mean a fairly steep loss compared to the original price, but a 6-year depreciation schedule is OK… IF the cards are still working 6 years later.
$3M NVL72 systems are a bit different: one of the 72 cards burning out can screw up the whole system, and given the datacenter power infrastructure and expertise requirements, they would have low resale value, though I assume the cards can be ripped out and sold individually.
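A quick back-of-envelope on why one-in-72 matters (the per-GPU failure rates here are made up purely for illustration):

```python
# If each of the 72 GPUs independently fails in a given year with
# probability p, the chance that at least one fails is 1 - (1 - p)^72.
# The p values below are assumptions, not measured rates.
for p in (0.01, 0.05, 0.10):
    at_least_one = 1 - (1 - p) ** 72
    print(f"per-GPU annual failure rate {p:.0%}: "
          f"{at_least_one:.0%} chance at least one card fails")
```

Even a 1% per-card rate gives better-than-even odds of a failure somewhere in the rack each year.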
They very much can be expected to engage in what is essentially accounting fraud.
Oracle this week “proudly boasted” that they get 30% margins on their datacenters, and the stock went up. This is not enough, as it is just 30% over electricity costs: maintenance/supervision and GPU costs/rentals don’t count, so it’s unlikely that they are profitable, though it’s not so much accounting fraud as it is accounting PR.
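To illustrate how “30% over electricity” can still mean a loss overall (all numbers invented; only the structure matters):

```python
# Hypothetical cost split in arbitrary units. The "30% margin" is
# computed over electricity alone, while depreciation and staffing
# are simply left out of the headline number.
electricity = 100
revenue = electricity * 1.30        # the boasted 30% over power
gpu_depreciation = 60               # assumed
maintenance_staffing = 20           # assumed

total_cost = electricity + gpu_depreciation + maintenance_staffing
print(f"revenue {revenue:.0f} vs total cost {total_cost} "
      f"-> net {revenue - total_cost:+.0f}")  # 130 vs 180 -> -50
```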
Hmm and what about the everyday user who needs to ask AI how long to cook potatoes? What will they do after the pop?
completely uncharted territory. No one tried to cook a potato until Sam Altman graced us plebs with ChatGPT
Local models are actually pretty great! They are not great at everything… but for what most people are using LLMs for they do a fine job.
That’s from llama3.1:8b, and the answer is decent. It took about 20 seconds to generate and used no more power than if I were to play a video game for the same amount of time.
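For anyone who wants to try it, here’s a minimal sketch of asking the same question locally, assuming Ollama is running with llama3.1:8b pulled (this uses Ollama’s standard /api/generate endpoint):

```python
# Ask a local model via Ollama's REST API; stdlib only, no extra deps.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3.1:8b",
        "prompt": "How long should I boil potatoes?",
        "stream": False,  # return one complete JSON response
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

If you don’t want to script it, `ollama run llama3.1:8b` from the terminal works just as well.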
Boiling time isn’t related to the original potato size; it’s related to the size of the pieces you cut. So the first half is irrelevant and the second half is overly verbose.
They’ll ask their parents, or look up cooking instructions on actual websites.
Cue 8 paragraphs of “I first learned to cook potatoes with my grandfather back in…” and every reader screaming “Oh my god just get to the recipe…”
How will I remember to put avocado in my BBQ sauce, without AI?
The sad thing is that those GPUs are on specialized boards in specialized servers with specialized cooling, and not really good for the kind of work consumers would want them for, assuming you could even get drivers. So almost all of those GPUs will realistically get scrapped if the bubble pops.
Maybe we’ll see the EPYC CPUs get sold secondhand at least, those are socketed.
There are plenty of these, yes. But there are also plenty of consumer-grade GPUs being used for it too.
I’m already doing fine with my RX 6400.