> rapid advances in chipmaking
Maybe it's a nit, but the advances are in how the chips access memory and are networked together.
> In reality, not all of the AI quintet’s servers would be useless after three years, let alone 12 months. They can keep performing oodles of non-AI work.
Not really. The AI servers are essentially useless for non-AI work.
Software can follow compute: as GPUs become cheaply available in the cloud, there's more and more pressure for software to take advantage of that.
My understanding is GPUs aren't general purpose. If you have to resort to setting up a vast network of cellular automata to do non-GPU workloads, those servers will get trounced by a Raspberry Pi that costs pennies in power.
Yes, but CPUs and GPUs diverged long ago because they do not solve the same set of problems.
Can I run k8s on a GPU? Yes, why not. Will it be efficient? No.
(replace k8s with "whatever random code you are mostly running")
So we outsource the parallel calculations to the cloud but handle branching on our CPUs?
TL;DR: Big Tech companies have become worth trillions based on their AI positions. NVIDIA is saying that its new chips almost make its old chips redundant, raising the awkward question: how long should we budget that these chips (which these players each have stacks and stacks of) will last?
Budget that the chips will be redundant in 3 years? 2 years? 1 year?
If 1-2 years, everyone's books are cooked and a massive write-off is coming at some point (which is what the article is implying with its back-of-envelope calcs; rough numbers sketched below).
So folks what’s your call? How many years is the real useful life of the NVIDIA chips they’ve all currently bought and filled their server farms with?
State years or months and why you think that timeframe :)
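For anyone who wants to poke at that back-of-envelope math themselves, here's a tiny sketch of how the assumed useful life drives the annual depreciation expense. The capex figure is a made-up placeholder, not a number from the article or any company's filings:

    # Rough illustration only: straight-line depreciation under different
    # useful-life assumptions. gpu_capex is a hypothetical placeholder.
    gpu_capex = 100e9  # assumed cumulative spend on AI accelerators, in dollars

    for useful_life_years in (6, 5, 3, 2, 1):
        # Straight-line: spread the cost evenly across the assumed life.
        annual_depreciation = gpu_capex / useful_life_years
        print(f"{useful_life_years} yr life -> ${annual_depreciation / 1e9:.1f}B/yr expense")

Shortening the schedule doesn't change the total cost, it just pulls the expense forward, which is why the "books are cooked" question hinges entirely on the useful-life assumption.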
Lots of variables here. How long does the current AI boom continue? If it's a bust in two years, expect a much longer service life for those GPUs. If it's still going, the question's more interesting. Is it more expensive to build a new datacenter or replace the old GPUs? What about the cost of operating them? How much better are new GPUs?
3 years - knowing nothing, I think the chips will likely have a three-year useful life. My intuition is that it's like buying a graphics card for a computer: it's fast for a couple of years.
It'll probably take at least 3 years of near-100% utilization just to break even for whoever's buying.
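Here's a toy version of that break-even math; every number below is a made-up assumption (purchase price, rental rate, hosting cost, utilization), not a quoted market figure:

    # Toy break-even estimate for a rented-out accelerator. All inputs are
    # hypothetical assumptions for illustration.
    purchase_price = 30_000.0   # $ per GPU (assumed)
    rental_rate = 1.75          # $ per GPU-hour billed (assumed)
    operating_cost = 0.50       # $ per GPU-hour for power/cooling/hosting (assumed)
    utilization = 0.90          # fraction of hours actually billed (assumed)

    margin_per_hour = (rental_rate - operating_cost) * utilization
    hours = purchase_price / margin_per_hour
    print(f"~{hours / (24 * 365):.1f} years to recover the purchase price")

Under those made-up numbers it comes out a bit over three years, and that ignores the datacenter buildout and networking around the cards, so "at least 3 years" doesn't seem crazy.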
Should be closer to crypto mining no?
This is so dumb. A100s were released in 2020, IIRC. They are still used in droves and are extremely valuable. If Meta/Goog/MSFT/whoever want to give me all their A100s, I'll organize a truck to go get them.
If you have a truck full of A100s, it's "extremely valuable" but also perishable goods.
Right now, any workable AI hardware is valuable because the market is not presently saturated, and people are in a "buy what you can get" mindset.
Once the market is reasonably saturated, people will get more selective. Older parts will be less desirable-- less efficient, less featureful-- and if you have trucks full of them to dump on the market, it's going to depress the price.
It's like the PC market of 30 years ago. You got a new Pentium-100, but you could still sell on the old 386/16 for a fair amount of cash because for someone else, it beats "no computer". That market doesn't really exist anymore-- today, you may as well just leave a Haswell or Ryzen 1000 box at the curb unless you want to spend 6 weeks dealing with Craigslist flakes for $30.
And when that is the situation, they can change their depreciation schedules. The fact remains that the current depreciation schedules for current hardware aren't unreasonable.