throwawayffffas
8 hours ago
A thought occurs: GPUs have a limited lifespan. GPUs die after 1-3 years of heavy use.[1] Just in time to train an LLM or two. The data centers themselves, without the GPUs, are like 40% of the cost from what I hear. Three years from now, when the boom ends, they are going to be empty warehouses with very good networking and cooling.
[1]. https://ithy.com/article/data-center-gpu-lifespan-explained-...
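Back-of-envelope on why that matters (every number below is a made-up illustration; only the ~40% facility share is from the claim above):

    # How much of a hypothetical $10B AI datacenter outlives its GPUs?
    # All figures are illustrative guesses, not real vendor numbers.
    total_capex = 10_000_000_000          # hypothetical build cost, USD
    facility_share = 0.40                 # shell, power, cooling, networking
    gpu_share = 1 - facility_share        # accelerators and servers

    gpu_lifespan_years = 3                # heavy-training wear-out, per [1]
    facility_lifespan_years = 15          # typical infrastructure depreciation guess

    gpu_cost_per_year = total_capex * gpu_share / gpu_lifespan_years
    facility_cost_per_year = total_capex * facility_share / facility_lifespan_years

    print(f"GPUs:     ${gpu_cost_per_year / 1e9:.2f}B/year")       # ~$2.00B/year
    print(f"Facility: ${facility_cost_per_year / 1e9:.2f}B/year")  # ~$0.27B/year

On those guesses the GPUs dominate the annualized cost by roughly 7x, which is exactly why the shell, power, cooling, and networking can still be standing (and worth something) long after the accelerators are scrap.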
adityaathalye
7 hours ago
True, they burn through GPUs. However, I wonder what the actual curve looks like... what fraction of total GPU capacity is getting maxed out to the three-year "burned to a crisp" threshold? Training is harsher than inference, which is harsher than speculative capacity-hoarding (because of competition).
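I haven't seen a public reliability curve, but as a toy sketch: suppose wear-out time scales inversely with sustained duty cycle (the 2-year full-duty life nods to [1]; the duty cycles and stress exponent are pure guesses):

    # Toy wear model: time-to-wear-out vs. sustained duty cycle.
    # full_duty_life = lifespan under 24/7 training load (~1-3 yrs per [1]);
    # stress_exp crudely models that hard duty wears more than linearly.
    # Every number here is an illustrative assumption, not measured data.
    def lifespan_years(duty_cycle: float,
                       full_duty_life: float = 2.0,
                       stress_exp: float = 1.5) -> float:
        duty_cycle = max(duty_cycle, 0.05)  # avoid blow-up near idle
        return full_duty_life / duty_cycle ** stress_exp

    for workload, duty in [("training", 0.95), ("inference", 0.50), ("hoarded", 0.10)]:
        print(f"{workload:>9}: {lifespan_years(duty):4.1f} years")
    # training:  2.2 years / inference:  5.7 years / hoarded: 63.2 years

The exact numbers are meaningless; the point is that if only the training slice runs at training-grade duty, the fleet-average lifespan could sit well past the headline 1-3 years.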
Even after that, what does a "burned out" GPU look like? Is it a total bust, or is it still usable at... say, 25% capacity for "consumer type applications"?
Thank you for that GPU lifespan explanation... taught me a thing or two today.
throwawayffffas
7 hours ago
> Even after that, what does a "burned out" GPU look like? Is it a total bust, or is it still usable at... say, 25% capacity for "consumer type applications"?
From what I hear it's a mix, ranging from completely dead to degraded performance.
> Training is harsher than inference, which is harsher than speculative capacity-hoarding (because of competition).
I have heard over 70% quoted as used for training, around 5% for general-purpose inference, and the rest for code generation. But don't quote me on those numbers; I don't recall the sources. One has to assume that some capacity is also used for traditional high-performance computing.
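Taking those hearsay shares at face value, the implied split for a hypothetical 1M-GPU fleet (HPC would presumably come out of one of these buckets):

    # Sanity-checking the quoted split on a hypothetical 1,000,000-GPU fleet.
    # Shares are the unsourced figures from the comment above, not real data.
    fleet = 1_000_000
    shares = {"training": 0.70, "general inference": 0.05, "code generation": 0.25}
    assert abs(sum(shares.values()) - 1.0) < 1e-9
    for use, share in shares.items():
        print(f"{use:>17}: {round(fleet * share):>9,} GPUs")
    # training: 700,000 / general inference: 50,000 / code generation: 250,000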
adityaathalye
4 hours ago
Still, I do wonder about the GPU manufacturing capacity upstream of the datacenters, even though it grows relatively slowly. I suppose NVIDIA's order book is full for a few years out. However, the capacity they add can't just be repurposed/retooled for other use cases.
What could substitute for LLM demand, if the LLM/AI business contracts rapidly?
adityaathalye
4 hours ago
Educated guesstimates are worth a lot. Thank you for the stats!