"However, this is not just a cyclical shortage driven by a mismatch in supply and demand, but a potentially permanent, strategic reallocation of the world’s silicon wafer capacity. [...] This is a zero-sum game: every wafer allocated to an HBM stack for an Nvidia GPU is a wafer denied to the LPDDR5X module of a mid-range smartphone or the SSD of a consumer laptop."
I wonder if this will result in writing more memory-efficient software? The trend for the last couple of decades has been that nearly all consumer software outside of gaming has moved to browsers or browser-based runtimes like Electron. There's been a vicious cycle of heavier software -> more RAM -> heavier software but if this RAM shortage is permanent, the cycle can't continue.
Apple and Google seemed to be working on local AI models as well. Will they have to scale that back due to lack of RAM on the devices? Or perhaps they think users will pay the premium for more RAM if it means they get AI?
Or is this all a temporary problem due to OpenAI's buying something like 40% of the wafers?
> I wonder if this will result in writing more memory-efficient software?
If the consumer market can't get cheap RAM anymore, the natural result is a pivot back to server-heavy technology (where all the RAM is anyway) with things like server-side rendering and thin clients. Developers are far too lazy to suddenly become efficient programmers and there's plenty of network bandwidth.
There's plenty of scope for local AI models to become more efficient, too. MoE doesn't need too much RAM: only the parameters for experts that are active at any given time truly need to be in memory, the rest can be in read-only storage and be fetched on demand. If you're doing CPU inference this can even be managed automatically by mmap, whereas loading params into VRAM must currently be managed as part of running an inference step. (This is where GPU drivers/shader languages/programming models could also see some improvement, TBH)
Nice assertion. Perhaps you meant that AI could be directed towards less memory intensive implementations. That would still have to be directed by those same lazy/poor coders because the code the AI is learning from is their bad code (for the most part).
IDK, given the prevalence of Electron and other technically-correct-but-inefficient code out there, at bare minimum it would require decent prompting to help.
If industry has a bit of fear that the demand will slow down by the time they can output meaningful amount of chips, then probably not. Time will show.
> There's been a vicious cycle of heavier software -> more RAM -> heavier software but if this RAM shortage is permanent, the cycle can't continue.
What do you mean it can't continue? You'll just have to deal with worse performance is all.
Revolutionary consumer-side performance gains like multi-core CPUs and switching to SSDs will be a thing of distant past. Enjoy your 2 second animations, peasant.
I don't really get the panic. This is the same as the pandemic, just for different reasons. A change in demand is causing supply shortages and price hikes. But the demand will eventually swing back as the current demand is completely unsustainable. AI demand will crumble, prices will bottom out, and companies who bet big on AI & RAM will end up going into big layoffs triggering another recession and a huge market crash. We literally just went through all this, it can't be a surprise.
On the AI correction point: given the K-shaped economy, we're headed for a mean reversion either way. Either the bottom corrects upward or the top corrects down. Real life being what it is, I'm bracing for the latter. Though I'd love to be proven wrong
One of the things I’ve been hoping for every time a new EC2 instance comes out is for them to unpin the memory:core ratio a bit. I don’t expect they have enough r# and c# users to completely balance things out so what they’re really doing is selling people more CPUs to get the memory they need.
It would be nice if it were creeping up generation to generation. But if this keeps up I fear the opposite.
For many ephemeral workloads, sure, but that comes at the expense of generally worse and less consistent CPU performance.
There are plenty of workloads where I’d love to double the memory and halve the cores compared to what the memory-optimised R instances offer, or where I could further double the cores and halve the RAM from what the compute-optimised C instances can do.
“Serverless” options can provide that to an extent, but it’s no free lunch, especially in situations where performance is a large consideration. I’ve found some use cases where it was better to avoid AWS entirely and opt for dedicated options elsewhere. AWS is remarkably uncompetitive in some use cases.
kv stores also exist because for many generations of tooling it was faster to manage read-mostly data off-heap instead of on, and that becomes more true the more processes you run doing jobs that touch the same data.
You should just get used to it because the memory per core is going down inexorably forever until someone makes a physics breakthrough. We know how to print cores and the core count is going to keep going up.
We’ll see. The current situation is an anomaly caused in part by the AI boom generally, in part by the OpenAI shenanigans. This time will pass, and what comes after is hard to know. If the shortages are sustained long enough, innovation will happen. The economics of it will bring new players into the market.
DRAM is a notoriously cyclical market, though, and wise investors are leery of jumping into a frothy top. So, it’ll take a while before anyone decides the price is right to stand up a new competitor.
How is it an opportunity for Apple? They are a customer of Samsung and Micron RAM modules just like everyone else is. They aren't in any unique position other than their user base is already used to paying extreme markup for RAM. Now whether Apple just eats the cost in their profit margin or charges even more for RAM remains to be seen.
Their supply chain prowess likely means they have already secured contracts for 2026 (and maybe even 2027), so they will not be affected by the price hike. But maybe they'll still use it as an opportunity to bump prices and rake in free profit, who knows.
> if Apple smartphone specs improve while Androids stagnate it could create more iOS user
Nah. The marginal utility of more smartphone ram is near zero at this point. The vast majority of people wouldn't even notice if the memory in their phone tripled overnight.
Basically if I have to start comparing iPhone specs to Android phone specs I might aswell just buy an Android. The point of iOS is that you don't have to.
The competitive advantage comes from Apple having the supply chain contracts in place to not be affected by the 2026 price hike as much. The Android phones will be more expensive and thus will capture less market share.
I agree with you and have agreed with you for a long time. However, I definitely see the writing on the wall. More than one person in my circle have traditionally been Android users and the lack of innovation from both Apple and Android have them comparing devices on specs MUCH more. I include myself in this list on my next upgrade. I'll be looking largely at specs on the next upgrad because honestly there's not much day to day difference in usage between apple and android anymore
> potential contraction in the global smartphone market alongside an increase in average selling prices (ASP). In 2026, in our moderate downside scenario, we could see the market contract by 2.9%. In our pessimistic downside scenario, it could be as bad as 5.2%.
> PC market contract by 4.9% compared with a 2.4% year-on-year decline in the November forecast. Under a more pessimistic scenario, the decline could deepen to 8.9%.
Article completely misses the true cause of the price increase - Sam Altman/OAI made a deal with Samsung and SK Hynix get 40% of their RAM wafer production for the 2026 period. This was economic warfare against OpenAI's competitors, and the competitors along with the data centers responded by buying up every bit of DDR5 in sight. This price increase was engineered.
The deal was inked on October 1, 2025, and rumors of it started swirling in September. Take a look at the RAM price charts. Anyone who attributes this just to "AI growth" has no idea what they're talking about. AI has been growing rapidly for three years and yet this price increase just happened exactly when Altman signed this deal.
It's also worth noting that IDC, who published this report, is wholly owned by Blackstone, who is also heavily invested in OpenAI. It would be prudent to be cautious about who you believe.
With a functional government, antitrust enforcement would prevent a single company from driving economy-wide price inflation out of an attempt to starve its competition. Since we don't have a functional government, we'll ungracefully take this up the ass.
Sure thing. Are they? And also, why would they do that? Do you think OpenAI wants to enter into the DRAM manufacturing business? Or were they looking for a way to take as much supply away as possible - paying for the wafers instead of finished DRAM?
Doesn't Apple routinely do the same thing? Reserve chip production for the leading-edge nodes, and sometimes enter into similar deals for other tech such as displays? I'm not seeing any evidence that this was intentional "warfare" on OpenAI's part: they're just making a high-stakes bet that they can ultimately find a better and higher-margin use for that raw DRAM than HNers' gaming battlestations, or whatever the next-best use was when they made that deal.
No, apple does not buy production capacity to prevent others from using it. They buy it to use it themselves.
The wafers are not DRAM. This is more likely burning oil wells so your enemy can't use them. Wafers are to chips what steel blanks are to engines. You basically need clean rooms just to accept delivery and entire fabs to do anything. Someone who doesn't own a fab buying the wafers is essentially buying them to destroy them.
Not sure that OpenAI's move was a very good one, they've just created a lot of enemies for themselves. I see comments all over the internet about AI slop making RAM expensive. It's going to eat into the profits of a lot of companies. People will be willing for this insanity to end.
Imagine a future where a resourceful computer will be unobtanium, because AI companies decided to outbid consumers. Your PC will be just powerful enough to work as a terminal, with all the heavy lifting done by cloud compute data centers.
Every functionality be will subscription-based. You'll own nothing and you'll be happy.
You are absolutely correct that much of modern software is extremely bloated. So maybe a silver lining is that less memory for the next couple of years will also force developers to produce leaner software. Electron apps being the first in line.
I'm struggling to picture how we get there from here? There's a huge pile of second hand PCs available, and almost all of them are massively more powerful than necessary for terminal purposes only.
I mean this is one of the risk factors in AI safety that's been communicated for a long time. It's not just computing, but potentially everything. Energy resources, land resources (like those used to grow food for us meat bags), transportation resources. Suddenly humans find themselves outbid by AI as AI has pushed us out of the economy.
The economy says nothing about requiring humans to exist.
Luckily Beelinks are still cheap, work decently, and can run Linux/Windows, so if all someone needs is to browse the Internet and do basic stuff, honestly? They’re fine. We’ll see how long that lasts though.
As a millenial all I see is my generation being repulsed by AI slop. Boomers and zoomers though have a large presence of consumption. It was easy to see this with your own family over the holidays.
They will tell you they are repulsed by it if asked but its a toss up if they can identify it. Look at any thread on Reddit/IG/Tiktok whatever and I personally would guess I could manage to identify AI output 20% of the time.
Boomers might be out there consuming those AI youtube videos that are just tiktok voice over with a generated slide show but Millennials think since they can identify this as slop that they are not affected. That is incorrect, and just as bad.
I'm not blaming them. It's really frustrating that old people are taken advantage of. We shouldn't need to be so cynical. This isn't the star trek future we were promised.
Edit: It's similarly frustrating about the zoomers. Parents are derelict of duty by not defending their kids and preparing them for the world they are in.
Sci-fi will never materialize. But the ones passionate about it are so desperate for the faux future that they won't be able to tell when they're being duped.
Just wait until the next great collapse, a disaster big enough to force change. Hopefully we'll have the right ideas lying around at the time to restructure our social communication system.
Nah, it's fueled by huge misinformation campaigns. It's going to kill art, put us all out of the job, uses 1.5 million gallons per query, pollutes water, will kill the electric grid, etc. These seem to be the most popular uninformed lines of thinking.
Ya we seem to live in the the place where the firehose of falsehood is filling the lake of bullshit asymmetry. The problem with this is uninformed lines of thinking eventually lead to policy.
I agree several of the commonly repeated critiques are really poor in quality and can be emotionally driven/simply parroted TikTok nonsense, but at the other end of the spectrum we have AI evangelists who get surprisingly aggressive if you say anything remotely negative about GenAI or suggest maybe we should be having a discussion about the ethical ramifications of these tools. Particularly how they are trained and deployed and who should be guiding that process.
I find it very odd when people proudly proclaim they used, say, Grok to answer a question. Their identity is so tied up in it that if you start talking about the quality of the information they get incredibly defensive. In contrast: I have never felt protective of my Google search results, which is basically the same thing given how most people use these tools currently.
It’s kind of wild how hostile some people get if you attempt to open the discussion up at all.
I live near the Great Lakes. Data center proposals are popping up and people think they are going to drain lake michigan. They think they will consume more power than the entire state consumes right now. Idiot yokels are chasing away what could be an absolute boon for our economy. But they'd rather have papermills and make cardboard boxes.
I don’t know the specifics of your region’s deal but the massive AI center deal Louisiana negotiated with facebook is absolutely awful. All it’s going to do is drive up energy costs for the residents and give very little in return, and that’s under the ideal situation in which it actually pans out like they’re expecting it to.
I always thought it was more about differences in productivity between sectors. If the Baumol effect made service sector wages increase, would wouldn't ineffencies do the inverse?
It sounds like effenciencies in manufacturing sectors lead to more costly medical services, and ineffencies in manufacturing sectors lead to costly medical services. Am I understanding correctly?
William J. Baumol the namesake of the Baumol Effect[1]. Generally it is
> the tendency for wages in jobs that have experienced little or no increase in labor productivity to rise in response to rising wages in other jobs that did experience high productivity growth
Specifically, manufacturing sectors have increased productivity and service sectors haven't.
"However, this is not just a cyclical shortage driven by a mismatch in supply and demand, but a potentially permanent, strategic reallocation of the world’s silicon wafer capacity. [...] This is a zero-sum game: every wafer allocated to an HBM stack for an Nvidia GPU is a wafer denied to the LPDDR5X module of a mid-range smartphone or the SSD of a consumer laptop."
I wonder if this will result in writing more memory-efficient software? The trend for the last couple of decades has been that nearly all consumer software outside of gaming has moved to browsers or browser-based runtimes like Electron. There's been a vicious cycle of heavier software -> more RAM -> heavier software but if this RAM shortage is permanent, the cycle can't continue.
Apple and Google seem to be working on local AI models as well. Will they have to scale that back due to a lack of RAM on devices? Or perhaps they think users will pay the premium for more RAM if it means they get AI?
Or is this all a temporary problem due to OpenAI's buying something like 40% of the wafers?
> I wonder if this will result in writing more memory-efficient software?
If the consumer market can't get cheap RAM anymore, the natural result is a pivot back to server-heavy technology (where all the RAM is anyway) with things like server-side rendering and thin clients. Developers are far too lazy to suddenly become efficient programmers and there's plenty of network bandwidth.
Developers would prefer to write good software; the challenge and the craftsmanship are a draw.
However, the customers do not care and will not pay more, so the business cannot justify it most of the time.
Who will pay twice (or five times) as much for software written in C instead of Python? Not many.
This is by design. Rent your computer... don't buy! Use GeForce Now!
There's plenty of scope for local AI models to become more efficient, too. MoE doesn't need too much RAM: only the parameters for experts that are active at any given time truly need to be in memory, the rest can be in read-only storage and be fetched on demand. If you're doing CPU inference this can even be managed automatically by mmap, whereas loading params into VRAM must currently be managed as part of running an inference step. (This is where GPU drivers/shader languages/programming models could also see some improvement, TBH)
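A rough sketch of what I mean for the CPU/mmap case, in Python with numpy; the per-expert file layout, shapes, and dtype are made up for illustration, and a real expert FFN has separate up/down projections plus a nonlinearity:

    import numpy as np

    HIDDEN, FFN = 4096, 14336                  # toy dimensions (assumed)
    EXPERT_PATH = "experts/expert_{i}.bin"     # hypothetical on-disk layout

    def load_expert(i):
        # np.memmap maps the file read-only into the address space; pages are
        # only faulted in from storage when the matmul actually touches them.
        return np.memmap(EXPERT_PATH.format(i=i), dtype=np.float16,
                         mode="r", shape=(HIDDEN, FFN))

    def moe_layer(x, routed):
        # routed = [(expert_id, gate_weight), ...] for this token. Only the
        # routed experts get touched, so only their pages land in RAM; cold
        # experts stay on disk and the OS page cache evicts as needed.
        out = np.zeros(HIDDEN, dtype=np.float32)
        for i, gate in routed:
            w = load_expert(i)   # stand-in for a real expert FFN
            out += gate * np.asarray(x @ w @ w.T, dtype=np.float32)
        return out

The point is that the "which experts are resident" policy comes for free from the page cache, whereas on the VRAM side you have to stage those weights yourself as part of the inference step.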
But aren't the experts chosen on a token-by-token basis, which means bandwidth limitations?
This is a temporary problem driven by the AI bubble. It's going to hurt until the bubble pops, but when that happens other things are going to hurt
The code AI produces will solve the memory-usage problems, which are themselves the result of lazy or poor human coders.
Nice assertion. Perhaps you meant that AI could be directed towards less memory intensive implementations. That would still have to be directed by those same lazy/poor coders because the code the AI is learning from is their bad code (for the most part).
IDK, given the prevalence of Electron and other technically-correct-but-inefficient code out there, at bare minimum it would require decent prompting to help.
It's not a zero sum game because silicon wafers are not a finite resource. Industry can and will produce more.
If the industry has a bit of fear that demand will slow down by the time it can output a meaningful amount of chips, then probably not. Time will tell.
Neither are paperclips.
> There's been a vicious cycle of heavier software -> more RAM -> heavier software but if this RAM shortage is permanent, the cycle can't continue.
What do you mean it can't continue? You'll just have to deal with worse performance is all.
Revolutionary consumer-side performance gains like multi-core CPUs and switching to SSDs will be a thing of the distant past. Enjoy your 2-second animations, peasant.
I don't really get the panic. This is the same as the pandemic, just for different reasons. A change in demand is causing supply shortages and price hikes. But the demand will eventually swing back, as the current demand is completely unsustainable. AI demand will crumble, prices will bottom out, and companies that bet big on AI & RAM will end up going into big layoffs, triggering another recession and a huge market crash. We literally just went through all this; it can't be a surprise.
On the AI correction point: given the K-shaped economy, we're headed for a mean reversion either way. Either the bottom corrects upward or the top corrects down. Real life being what it is, I'm bracing for the latter. Though I'd love to be proven wrong
One of the things I’ve been hoping for every time a new EC2 instance comes out is for them to unpin the memory:core ratio a bit. I don’t expect they have enough r# and c# users to completely balance things out so what they’re really doing is selling people more CPUs to get the memory they need.
It would be nice if it were creeping up generation to generation. But if this keeps up I fear the opposite.
The best way of "unpinning" those ratios for many ephemeral workloads is to use Lambda/FaaS, not EC2.
For many ephemeral workloads, sure, but that comes at the expense of generally worse and less consistent CPU performance.
There are plenty of workloads where I’d love to double the memory and halve the cores compared to what the memory-optimised R instances offer, or where I could further double the cores and halve the RAM from what the compute-optimised C instances can do.
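To put rough numbers on it (from memory, so treat the exact figures as approximate): current-generation x86 instances pin roughly 2 GiB per vCPU on C, 4 on M, and 8 on R, e.g. c6i.xlarge ≈ 4 vCPU / 8 GiB and r6i.xlarge ≈ 4 vCPU / 32 GiB. "Double the memory, halve the cores" relative to R would be a 16 GiB/vCPU shape, which as far as I know only the much pricier X family gets close to, and "double the cores, halve the RAM" relative to C would be 1 GiB/vCPU, which I don't think exists at all. So you end up buying vCPUs you don't need to get the RAM, or the other way around.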
“Serverless” options can provide that to an extent, but it’s no free lunch, especially in situations where performance is a large consideration. I’ve found some use cases where it was better to avoid AWS entirely and opt for dedicated options elsewhere. AWS is remarkably uncompetitive in some use cases.
KV stores also exist because, for many generations of tooling, it was faster to manage read-mostly data off-heap instead of on-heap, and that becomes more true the more processes you run doing jobs that touch the same data.
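A tiny sketch of that off-heap pattern in Python, assuming a hypothetical read-mostly dataset laid out as fixed-width records in a local file (the file name and record size are made up for illustration):

    import mmap
    import os

    DATA_FILE = "read_mostly.dat"   # hypothetical: 100k fixed-width records
    RECORD = 64

    # Build the file once (in practice some offline job produces it).
    if not os.path.exists(DATA_FILE):
        with open(DATA_FILE, "wb") as f:
            for i in range(100_000):
                f.write(f"record-{i}".ljust(RECORD).encode())

    # Each worker process maps the same file read-only. The kernel keeps one
    # copy in the page cache, so N processes share that copy instead of each
    # deserialising it onto its own heap (and the GC never sees it).
    with open(DATA_FILE, "rb") as f:
        buf = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)

    def get_record(idx):
        return buf[idx * RECORD:(idx + 1) * RECORD].rstrip(b" ")

    print(get_record(42))   # b'record-42'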
You should just get used to it because the memory per core is going down inexorably forever until someone makes a physics breakthrough. We know how to print cores and the core count is going to keep going up.
We knew how to print memory long before we knew how to print cores.
We’ll see. The current situation is an anomaly caused in part by the AI boom generally, in part by the OpenAI shenanigans. This time will pass, and what comes after is hard to know. If the shortages are sustained long enough, innovation will happen. The economics of it will bring new players into the market.
DRAM is a notoriously cyclical market, though, and wise investors are leery of jumping into a frothy top. So, it’ll take a while before anyone decides the price is right to stand up a new competitor.
I see this as a competitive opportunity for Apple. If Apple smartphone specs improve while Androids stagnate it could create more iOS users.
The promised AI metaverse is still a long way off and in the meantime people still want the best smartphone.
How is it an opportunity for Apple? They are a customer of Samsung and Micron RAM modules just like everyone else is. They aren't in any unique position other than their user base is already used to paying extreme markup for RAM. Now whether Apple just eats the cost in their profit margin or charges even more for RAM remains to be seen.
Their supply chain prowess likely means they have already secured contracts for 2026 (and maybe even 2027), so they will not be affected by the price hike. But maybe they'll still use it as an opportunity to bump prices and rake in free profit, who knows.
Apple reportedly negotiated a 50% price increase in memory supply for 2026, down from a higher price given to other OEMs.
> if Apple smartphone specs improve while Androids stagnate it could create more iOS user
Nah. The marginal utility of more smartphone RAM is near zero at this point. The vast majority of people wouldn't even notice if the memory in their phone tripled overnight.
I have no idea how much RAM my phone has.
And if you think that somebody buys an iPhone because they compare the specs with Android :)))))
Basically, if I have to start comparing iPhone specs to Android phone specs, I might as well just buy an Android. The point of iOS is that you don't have to.
The competitive advantage comes from Apple having the supply chain contracts in place to not be affected by the 2026 price hike as much. The Android phones will be more expensive and thus will capture less market share.
iPhone being expensive is a feature.
"What do you mean my status flagship iPhone costs only half as much as a flagship Android???"
I agree with you and have agreed with you for a long time. However, I definitely see the writing on the wall. More than one person in my circle has traditionally been an Android user, and the lack of innovation from both Apple and Android has them comparing devices on specs MUCH more. I include myself in this list on my next upgrade. I'll be looking largely at specs because, honestly, there's not much day-to-day difference in usage between Apple and Android anymore.
Most regular people don’t care about RAM specs. And lately it’s Apple that has been rather stagnating in terms of features.
What, because they aren't shoving an LLM in every orifice of their product?
I would love stagnation; the keyboard has always been a dumpster fire, but these days it is an actively regressing dumpster fire.
> potential contraction in the global smartphone market alongside an increase in average selling prices (ASP). In 2026, in our moderate downside scenario, we could see the market contract by 2.9%. In our pessimistic downside scenario, it could be as bad as 5.2%.
> PC market contract by 4.9% compared with a 2.4% year-on-year decline in the November forecast. Under a more pessimistic scenario, the decline could deepen to 8.9%.
These companies can probably sense that China is catching up, and are trying to squeeze out as much juice as possible before that happens.
Imagine if China comes to the rescue and supplies the world with affordable RAM and open sources the technology, like how they did with DeepSeek.
The article completely misses the true cause of the price increase: Sam Altman/OAI made a deal with Samsung and SK Hynix to get 40% of their RAM wafer production for the 2026 period. This was economic warfare against OpenAI's competitors, and the competitors, along with the data centers, responded by buying up every bit of DDR5 in sight. This price increase was engineered.
The deal was inked on October 1, 2025, and rumors of it started swirling in September. Take a look at the RAM price charts. Anyone who attributes this just to "AI growth" has no idea what they're talking about. AI has been growing rapidly for three years and yet this price increase just happened exactly when Altman signed this deal.
https://pcpartpicker.com/trends/price/memory/
It's also worth noting that IDC, who published this report, is wholly owned by Blackstone, who is also heavily invested in OpenAI. It would be prudent to be cautious about who you believe.
Right! Raw wafers, not even memory. I have seen no evidence that this move was anything but a means of taking product out of the global supply chain.
That's correct and a good point I almost forgot about - they can't even utilize what they bought!
It feels like abuse. They shouldn't be able to get away with such trickery.
With a functional government, antitrust enforcement would prevent a single company from driving economy-wide price inflation out of an attempt to starve its competition. Since we don't have a functional government, we'll ungracefully take this up the ass.
Stockpiling solely in order to deprive your competitors of a commodity is anti-competitive and illegal.
You do know that they can hire semiconductor packaging companies to put together memory modules the same way they bought the DRAM wafers, right?
Sure thing. Are they? And also, why would they do that? Do you think OpenAI wants to enter into the DRAM manufacturing business? Or were they looking for a way to take as much supply away as possible - paying for the wafers instead of finished DRAM?
I hadn't heard of this, so I searched the web (without any help from an LLM, btw) and found this:
https://www.mooreslawisdead.com/post/sam-altman-s-dirty-dram...
After the DNS entry, the stockpile of RAM may be the most valuable asset that company has.
Doesn't Apple routinely do the same thing? Reserve chip production for the leading-edge nodes, and sometimes enter into similar deals for other tech such as displays? I'm not seeing any evidence that this was intentional "warfare" on OpenAI's part: they're just making a high-stakes bet that they can ultimately find a better and higher-margin use for that raw DRAM than HNers' gaming battlestations, or whatever the next-best use was when they made that deal.
No, apple does not buy production capacity to prevent others from using it. They buy it to use it themselves.
The wafers are not DRAM. This is more like burning oil wells so your enemy can't use them. Wafers are to chips what steel blanks are to engines. You basically need clean rooms just to accept delivery and entire fabs to do anything with them. Someone who doesn't own a fab buying the wafers is essentially buying them to destroy them.
Not sure that OpenAI's move was a very good one; they've just created a lot of enemies for themselves. I see comments all over the internet about AI slop making RAM expensive. It's going to eat into the profits of a lot of companies. People will be willing this insanity to end.
We can only hope you are right
Imagine a future where a well-resourced computer will be unobtainium, because AI companies decided to outbid consumers. Your PC will be just powerful enough to work as a terminal, with all the heavy lifting done by cloud compute data centers.
Every functionality will be subscription-based. You'll own nothing and you'll be happy.
“Heavy lifting” here being running 108 MB of JavaScript to render the shopping cart of an online store, correct?
Or consuming 2 GB of RAM to have Teams running in the background doing nothing?
Yeah, if we got rid of that as a result of RAM shortages, that’d be great.
You are absolutely correct that much of modern software is extremely bloated. So maybe a silver lining is that less memory for the next couple of years will also force developers to produce leaner software. Electron apps being the first in line.
I'm struggling to picture how we get there from here? There's a huge pile of second hand PCs available, and almost all of them are massively more powerful than necessary for terminal purposes only.
I mean this is one of the risk factors in AI safety that's been communicated for a long time. It's not just computing, but potentially everything. Energy resources, land resources (like those used to grow food for us meat bags), transportation resources. Suddenly humans find themselves outbid by AI as AI has pushed us out of the economy.
The economy says nothing about requiring humans to exist.
That is the logical conclusion. The era of personal computers is coming to an end. It had a good run though.
Luckily Beelinks are still cheap, work decently, and can run Linux/Windows, so if all someone needs is to browse the Internet and do basic stuff, honestly? They’re fine. We’ll see how long that lasts though.
All so people in developing countries can churn out boomer-baiting slop for social media engagement farming and ad views.
What makes you think it’s limited to boomers.. know of just as many millennials that eat the stuff up too.
As a millennial, all I see is my generation being repulsed by AI slop. Boomers and zoomers, though, are heavy consumers of it. It was easy to see this with your own family over the holidays.
They will tell you they are repulsed by it if asked, but it's a toss-up whether they can identify it. Look at any thread on Reddit/IG/TikTok, whatever; I personally would guess I could manage to identify AI output 20% of the time.
Boomers might be out there consuming those AI YouTube videos that are just a TikTok voiceover with a generated slideshow, but millennials think that since they can identify this as slop, they are not affected. That is incorrect, and just as bad.
We mustn't be unkind to the boomers. When we're their age, the methods for assaulting our poor old brains will be ever so much more sophisticated.
I'm not blaming them. It's really frustrating that old people are taken advantage of. We shouldn't need to be so cynical. This isn't the Star Trek future we were promised.
Edit: It's similarly frustrating with the zoomers. Parents are derelict in their duty by not defending their kids and preparing them for the world they are in.
> This isn't the Star Trek future we were promised.
It is, though. We're just in the part leading up to WWIII.
Yeah, that bit of the Star Trek Universe is something not many folks know about.
You want to be born into the utopia, not before.
Sci-fi will never materialize. But the ones passionate about it are so desperate for the faux future that they won't be able to tell when they're being duped.
Just wait until the next great collapse, a disaster big enough to force change. Hopefully we'll have the right ideas lying around at the time to restructure our social communication system.
Until then, it's slow decline. Embrace it.
Sci-fi has materialized, we're living in it now. The problem is it's the dystopia edition.
As someone succinctly put it: The future is here, it is just very unevenly distributed.
https://yougov.com/en-us/articles/50754-how-different-genera...
What do ads have to do with AI slop?
“All so people in developing countries can churn out boomer-baiting slop for social media engagement farming and ad views.”
https://news.ycombinator.com/item?id=46413716
I think your sample of Millennials is probably better informed.
Nah, it's fueled by huge misinformation campaigns. It's going to kill art, put us all out of the job, uses 1.5 million gallons per query, pollutes water, will kill the electric grid, etc. These seem to be the most popular uninformed lines of thinking.
We have always been at war with Eastasia.
Ya, we seem to live in the place where the firehose of falsehood is filling the lake of bullshit asymmetry. The problem with this is that uninformed lines of thinking eventually lead to policy.
I agree several of the commonly repeated critiques are really poor in quality and can be emotionally driven/simply parroted TikTok nonsense, but at the other end of the spectrum we have AI evangelists who get surprisingly aggressive if you say anything remotely negative about GenAI or suggest maybe we should be having a discussion about the ethical ramifications of these tools. Particularly how they are trained and deployed and who should be guiding that process.
I find it very odd when people proudly proclaim they used, say, Grok to answer a question. Their identity is so tied up in it that if you start talking about the quality of the information they get incredibly defensive. In contrast: I have never felt protective of my Google search results, which is basically the same thing given how most people use these tools currently.
It’s kind of wild how hostile some people get if you attempt to open the discussion up at all.
I live near the Great Lakes. Data center proposals are popping up and people think they are going to drain Lake Michigan. They think they will consume more power than the entire state consumes right now. Idiot yokels are chasing away what could be an absolute boon for our economy. But they'd rather have paper mills and make cardboard boxes.
I don’t know the specifics of your region’s deal, but the massive AI center deal Louisiana negotiated with Facebook is absolutely awful. All it’s going to do is drive up energy costs for the residents and give very little in return, and that’s under the ideal situation in which it actually pans out like they’re expecting it to.
They also don’t care about the communities they are impacting in the slightest. https://lailluminator.com/2025/11/22/meta-data-center-crashe...
I welcome this. Anything that makes consumer electronics more expensive acts counter to the Baumol Effect.
How scarce does memory have to get before it makes health care half as expensive?
Where do you think the IT costs for doctor's workstations are going to be redirected?
I always thought it was more about differences in productivity between sectors. If the Baumol effect made service-sector wages increase, wouldn't inefficiencies do the inverse?
It makes all electronics more expensive. This makes every service more expensive as well.
It sounds like efficiencies in manufacturing sectors lead to more costly medical services, and inefficiencies in manufacturing sectors also lead to costly medical services. Am I understanding correctly?
Who said efficiencies in manufacturing sectors lead to more costly medical services?
William J. Baumol, the namesake of the Baumol Effect [1]. Generally it is
> the tendency for wages in jobs that have experienced little or no increase in labor productivity to rise in response to rising wages in other jobs that did experience high productivity growth
Specifically, manufacturing sectors have increased productivity and service sectors haven't.
1. https://en.wikipedia.org/wiki/Baumol_effect
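To make that concrete with toy numbers: suppose factory productivity triples over a few decades while a haircut still takes the same 30 minutes. Factory wages can triple without raising unit costs, but barbers' wages have to roughly keep pace or nobody would stay a barber, so the real price of a haircut roughly triples even though nothing about cutting hair got less efficient. (Illustrative arithmetic only, not figures from Baumol.)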