One point she makes is that these chips don't last forever and must be replaced every couple of years at least; there is a cycle. Nvidia is, imo, trying to replicate the iPhone model, only with chips for enterprises rather than smartphones for consumers. For example, Nvidia moved from a 2-year upgrade cycle to a 1-year one, just as the iPhone went from an every-other-year upgrade cycle (with an S transitional year) to annual upgrades. In fact, the iPhone gets at least a few new models each year, including regular, Pro, and Pro Max. Nvidia is also trying to turn CUDA into the App Store of its platform. There's even the premium-market parallel: the iPhone and Nvidia GPUs are the most expensive of their ilk. The way I conceive of Nvidia is that it's the iPhone, only on the enterprise side. Nvidia's sales will therefore be tied more to the business cycle than to the consumer cycle, and that could mean a more cyclical business than Apple ever had. However, it's also a very different business model from Cisco's. It's almost like Apple's business model in Cisco's market segment.
I hate the AI writing but this is probably right.
- There are a number of hardware competitors, from Microsoft, Meta, Google, and Amazon. Their chips mostly serve internal needs today, but Amazon and Google tend to productize their internal technology. Other players are also selling chips; the author mentions Broadcom.
- Both Google and Apple are doing their best to shift inference costs onto the consumer's device. While on-device inference seems expensive right now, that will last at most a couple of years: once RAM production catches up, it will take pressure off cloud inference.
NVIDIA will probably hit 8T before it shrinks. But it's now in a hypercompetitive market in a way it wasn't before LLMs started driving demand for its hardware.
Arguments like this require more reasoning than “because history”.
What's different this time around?
The author made compelling points regarding capex cycles and supply/demand imbalances. So how does NVDA continue to deliver these returns over the next 10 years? Are the 4 firms driving >60% of NVDA's revenue going to maintain that high capex?
All it will take is a market shock that forces just one of these companies to pivot from spending to cost-cutting mode for NVDA's bottom line to see material compression. This is not to say that NVDA will disappear, but it's a very real risk, and it sets the stage for contagion given how coupled all this growth is.
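To put rough numbers on the concentration risk, here's a minimal sketch. All figures below are illustrative assumptions (an even split among the four customers, a 50% capex cut), not actual NVDA financials:

```python
# Illustrative sketch of revenue-concentration risk.
# Every number here is a hypothetical assumption, not real NVDA data.

total_revenue = 100.0    # normalize total revenue to 100 units
big_four_share = 0.60    # ">60% of revenue from 4 firms", per the comment above
per_customer = total_revenue * big_four_share / 4   # assume an even split

# One customer pivots from spending to cost-cutting: capex halved.
cut_fraction = 0.50
revenue_lost = per_customer * cut_fraction

print(f"revenue lost: {revenue_lost:.1f} of {total_revenue:.0f}")   # 7.5 of 100
print(f"remaining:    {total_revenue - revenue_lost:.1f}")          # 92.5
```

Under these assumptions, a single customer halving its spend erases roughly 7.5% of total revenue, before any second-order effects on the other three.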
Arguments against history require more reasoning. Patterns recur because that's how the system works. "This time it will be different" requires justification for why the system has significantly changed. The article makes a strong case that the system hasn't changed.
Let's hear an Apple & Google take too once they hit 5T