624 comments

  • ComputerGuru 4 days ago ago

    Framing it in gigawatts is very interesting given the controversy about skyrocketing electric prices for residential and small-business users as a result of datacenters over the past three years, primarily driven by AI growth. If, as another commenter notes, this 10GW is how much Chicago and NYC use combined, then we need to have a serious discussion about where this power is going to come from, given the dismal state of the USA's power grid and related infrastructure and the already exploding costs that have been shifted to residential users in order to guarantee electric supply to the biggest datacenters (so they can keep paying peanuts for electricity and avoid shouldering any of the infrastructural burden of maintaining or improving the underlying grid/plants required to meet their massive power needs).

    I'm not even anti-datacenter (wouldn't be here if I were); I just think there needs to be a serious rebalancing of these costs, because this increase in US residential electric prices in just five years (from 13¢ to 19¢, a ridiculous 46% increase) is neither fair nor sustainable.
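A quick sanity check on those figures (a sketch; the annualized rate is my own arithmetic from the two prices cited above):

```python
# Residential electricity price cited above: 13 cents -> 19 cents/kWh over five years.
old_price, new_price, years = 0.13, 0.19, 5

total_increase = new_price / old_price - 1          # cumulative increase
annualized = (new_price / old_price) ** (1 / years) - 1  # compound annual rate

print(f"total: {total_increase:.1%}")       # 46.2%
print(f"annualized: {annualized:.1%}")      # 7.9% per year
```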

    So where is this 10GW electric supply going to come from and who is going to pay for it?

    Source: https://fred.stlouisfed.org/series/APU000072610

    EDIT:

    To everyone arguing this is how DCs are normally sized: yes, but normally it's not the company providing the compute for the DC owner that is giving these numbers. Nvidia doesn't sell empty datacenters with power distribution networks, cooling, and little else; Nvidia sells the GPUs that will stock that DC. This isn't a typical PR newswire bulletin of "OpenAI announces new 10GW datacenter"; this is "Nvidia is providing xx compute for OpenAI". Anyway, all this is a digression from the questions of power supply, consumption, grid expansion/stability, and who is paying for all that.

    • elbasti 4 days ago ago

      I work in the datacenter space. The power consumption of a data center is the "canonical" way to describe their size.

      Almost every component in a datacenter is upgradeable—in fact, the compute itself only has a lifespan of ~5 years—but the power requirements are basically locked-in. A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.

      The fact that we use this unit really drives home that AI is basically refining energy.

      • aurareturn 4 days ago ago

          A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.
        
        This here underscores how important TSMC's upcoming N2 node is. It only increases chip density by ~1.15x (very small relative to previous node advancements), but it uses 36% less energy at the same speed as N3, or is 18% faster than N3 at the same energy. It's coming at the right time for AI chips used by consumers and energy-starved data centers.

        N2 is shaping up to be TSMC's most important node since N7.

        • alberth 4 days ago ago

          > N2 is shaping up to be TSMC's most important node since N7

          Is it?

          N2, in terms of energy & perf improvement, seems on par with any typical node-generation update.

                    N2:N3   N3:N5  N5:N7
            Power   ~30%    ~30%    ~30%
            Perf    ~15%    ~15%    ~15%
          
          https://www.tomshardware.com/news/tsmc-reveals-2nm-fabricati...
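For context, if each step really delivers ~30% power reduction at iso-performance (the ballpark in the table above), the gains compound quickly across node transitions (my own arithmetic, a Python sketch):

```python
# Compound power savings across node transitions, assuming each step
# cuts power ~30% at the same performance (figures from the table above).
per_node_reduction = 0.30
steps = 3  # N7 -> N5 -> N3 -> N2

remaining = (1 - per_node_reduction) ** steps
print(f"power after {steps} steps: {remaining:.0%} of N7")  # 34%
```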
          • aurareturn 4 days ago ago

            Yes. It has more tape-outs at this stage of development than either N5 or N3 did. It seems wildly popular with chip designers.

            • alberth 3 days ago ago

              I thought Apple gets exclusive access to the latest node for the first 1-2 years. Is that not the case?

              • aurareturn 3 days ago ago

                No. That's not the case. Maybe for a few months only.

                • alberth 3 days ago ago

                  Correct me if I'm wrong, but didn't TSMC launch N3 in 2022? And still only Apple uses this latest/smallest node.

                  Both AMD and NVIDIA are using N4.

                  • aurareturn 2 days ago ago

                    Apple, Mediatek, Qualcomm, Intel

      • pseudosavant 4 days ago ago

        I love that term "refining energy". We need to plan for massive growth in electricity production to have the supply to refine.

        • tmalsburg2 4 days ago ago

          Sounds smart but it’s abusing the semantics of “refine” and is therefore ultimately vacuous.

          • pseudosavant 3 days ago ago

            I think it is really just the difference between chemically refining something and electrically refining something.

            Raw AC comes in, then gets stepped down, filtered, converted into DC rails, gated, timed, and pulsed. That’s already an industrial refinement process. The "crude" incoming power is shaped into the precise, stable forms that CPUs, GPUs, RAM, storage, and networking can actually use.

            Then those stable voltages get flipped billions of times per second into ordered states, which become instructions, models, inferences, and other high-value "product."

            It sure seems like a series of processes for refining something.

      • jacquesm 4 days ago ago

        It is the opposite of refining energy. Electrical energy is steak, what leaves the datacenter is heat, the lowest form of energy that we might still have a use for in that concentration (but most likely we are just dumping it in the atmosphere).

        Refining is taking a lower quality energy source and turning it into a higher quality one.

        What you could argue is that it adds value to bits. But the bits themselves, their state is what matters, not the energy that transports them.

        • elbasti 4 days ago ago

          I think you're pushing the metaphor a bit far, but the parallel was to something like ore.

          A power plant "mines" electrons, which the data center then refines into words, or whatever. The point is that energy is the raw material that flows into data centers.

          • fuzzfactor 4 days ago ago

            Maybe more like converting energy to data, as a more specific type of refinement.

            • phkahler 4 days ago ago

              Using energy to decrease the entropy of data. Or to organize and structure data.

              • LaGrange 3 days ago ago

                This is OpenAI, they are not decreasing the entropy. This is refining coal into waste heat and CO2.

              • fuzzfactor 4 days ago ago

                I like that. Take random wild electrons and put them neatly into rows & columns where they can sit a spell.

      • reubenmorais 4 days ago ago

        All life is basically refining energy - standing up to entropy and temporarily winning the fight.

        • HPsquared 4 days ago ago

          It's all about putting the entropy somewhere else and keeping your own little area organised.

          • xnickb 4 days ago ago

            People of the earth, remember: unnecessary arm and leg movements increase entropy! Fear the heat death of the universe! Lie down when possible!

        • antihipocrat 4 days ago ago

          Yes, in a very local context it appears so, but net entropy across the system from life's activities is increased

        • ithkuil 4 days ago ago

          "the purpose of life is to hydrogenate carbon dioxide"

          -- Michael Russel

      • casey2 3 days ago ago

        Where do the cards go after 5 years? I don't see a large surplus of mid-sized cloud providers coming to buy them (because AI isn't profitable). Maybe other countries (possibly illegally)? Flood the consumer market with cards they can't use? TSMC has more than doubled its packaging capacity and is planning on doubling it again.

      • protocolture 4 days ago ago

        This.

        A local to me ~40W datacenter used to be in really high demand, and despite having excess rack space, had no excess power. It was crazy.

        • nixass 4 days ago ago

          40W - is that ant datacenter? :)

          • protocolture 4 days ago ago

            Yeah, it was the company's pilot site, and everything about it was tiny.

            But it very quickly became the best place in town for carrier interconnection. So every carrier wanted in.

            Even when bigger local DCs went in, a lot of what they were doing was just landing virtual cross-connects to the tiny one, because that's where everyone was.

            • inemesitaffia 4 days ago ago

              You lost a M or K next to your W.

              I still have an Edison bulb that consumes more power.

      • pabs3 4 days ago ago

        > the power requirements are basically locked-in

        Why is that? To do with the incoming power feed or something else?

        • brendoelfrendo 4 days ago ago

          Basically, yes. When you stand up something that big, you need to work with the local utilities to ensure they have the capacity for what you're doing. While you can ask for more power later on, if the utilities can't supply it or the grid can't transport it, you're SOL.

          • pabs3 3 days ago ago

            You could in theory supplement it with rooftop solar and batteries, especially if you can get customers who can curtail their energy use easily. Datacentres have a lot of roof space; they could at least reduce their daytime energy costs a bit. I wonder why you don't see many doing solar: do the economics not work out yet?

            • brendoelfrendo 3 days ago ago

              I'd have to do the math, but I doubt that makes sense given the amount of power these things are drawing. I've heard of DCs having on-site power generation, but it's usually in the form of diesel generators used for supplemental or emergency power. In one weird case, I heard about a DC that used on-site diesel as primary power and used the grid as backup.

            • XorNot 3 days ago ago

              Compared to their footprint they absolutely do not: you get about ~1 kW/m^2 of peak solar irradiance. Some quick googling suggests a typical DC workload would be about 50 kW/m^2, rising to 100 for AI workloads.
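A rough ratio using those figures (a sketch; note that ~1 kW/m^2 is peak irradiance, and real panels deliver roughly 20% of that, which makes the gap even wider):

```python
# Rooftop solar output vs. datacenter power density, per square meter.
panel_output_kw_per_m2 = 1.0 * 0.20  # ~1 kW/m^2 irradiance x ~20% panel efficiency
dc_load_kw_per_m2 = 50.0             # typical DC floor load; ~100 for AI racks

fraction = panel_output_kw_per_m2 / dc_load_kw_per_m2
print(f"rooftop solar covers ~{fraction:.1%} of the load, at noon")  # ~0.4%
```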

        • jl6 3 days ago ago

          Cooling too. A datacenter that takes 200MW in has to dissipate 200MW of heat to somewhere.
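To make the cooling side concrete, a sketch of the water flow needed to carry away 200 MW, using Q = m·c·ΔT (the 10 K loop temperature rise is my own assumption):

```python
# Cooling water flow required to reject 200 MW of heat.
heat_w = 200e6      # W of heat to remove
c_water = 4186      # J/(kg*K), specific heat of water
delta_t = 10        # K temperature rise across the loop (assumed)

flow_kg_s = heat_w / (c_water * delta_t)
print(f"{flow_kg_s:,.0f} kg/s of water")  # 4,778 kg/s
```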

        • djtriptych 4 days ago ago

          guessing massive capital outlays and maybe irreversible site selection/preparation concerns.

      • kulahan 4 days ago ago

        That's pretty interesting. Is it just because the power channels are the most fundamental aspect of the building? I'm sorta surprised you can't rip out old cables and drop in new ones, or something to that effect, but I also know NOTHING about electricity.

        • libraryofbabel 4 days ago ago

          Not an expert, but it’s probably related to cooling. Every joule of that electricity that goes in must also leave the datacenter as heat. And the whole design of a datacenter is centered around cooling requirements.

          • vrighter 3 days ago ago

            Exactly. To add to that, I'd like to point out that when this person says every joule, he is not exaggerating (only a teeny tiny bit). The actual computation itself barely uses any energy at all.

      • pjc50 3 days ago ago

        Refining it into what? Stock prices?

    • deelowe 4 days ago ago

      DC infra is always allocated in terms of watts. From this number, everything else is extrapolated (e.g. rough IT load, cooling needed, etc).

    • epolanski 4 days ago ago

      > is neither fair nor sustainable

      That's half what I pay in Italy, I'm sure the richest country in the world will do fine.

      • FirmwareBurner 3 days ago ago

        >I'm sure the richest country in the world will do fine.

        You underestimate how addicted the US is to cheap energy and how wasteful it is at the same time.

        Remember how your lifestyle always expands to fill the available resources, no matter how good you have it? Well, if tomorrow they had to pay EU prices, there would be a war.

        When you've lived your entire life not caring about the energy bill or about saving energy, it's crippling to suddenly have to scale back and be frugal, even if that price would still be less than what other countries pay.

        • impjohn 2 days ago ago

          It's hard to appreciate the difference in 'abundance mentality' between the median US and EU person. It always struck me as an interesting culture difference. While both EU and US grew in prosperity post WWII, I feel the US narrative was quite on another level.

      • modo_mario 3 days ago ago

        Here in Belgium a stupid amount of that bill is hidden taxes. I kind of assume it's similar in Italy.

        • epolanski 3 days ago ago

          We import most of our energy, that's really it.

        • port11 3 days ago ago

          And the substantial increase in profits for all providers, which isn't comparable to that of our neighbours. Our disposable income in Belgium really exists to subsidise energy companies, supermarkets, and a pathetic housing market.

    • abstractwater 4 days ago ago

      > So where is this 10GW electric supply going to come from and who is going to pay for it?

      I would also like to know. It's a LOT of power to supply. Nvidia does have a ~3% stake in Applied Digital, a bitcoin miner that pivoted to AI (also a "Preferred NVIDIA Cloud Partner") with facilities in North Dakota. So they might be involved for a fraction of those 10GW, but it seems like it will be a small fraction even with all the planned expansions.

      https://www.investopedia.com/applied-digital-stock-soars-on-...

      https://ir.applieddigital.com/news-events/press-releases/det...

    • gitpusher 4 days ago ago

      > Framing it in gigawatts is very interesting given the controversy

      Exactly. When I saw the headline I assumed it would contain some sort of ambitious green energy build-out, or at least a commitment to acquire X% of the energy from renewable sources. That's the only reason I can think to brag about energy consumption

      • 7952 4 days ago ago

        Or this brings power and prestige to the country that hosts it. And it gives clout precisely because it is seemingly wasteful. Finding the energy is a problem for the civilian government who either go "drill baby drill" or throw wind/solar/nuclear at the problem.

    • paulsutter 4 days ago ago

      Datacenters need to provide their own power/storage, and connect to the grid just to trade excess energy or provide grid stability. Given the 5-7 year backlog of photovoltaic projects waiting for interconnect, the grid is kind of a dinosaur that needs to be routed around

    • mensetmanusman 4 days ago ago

      “ skyrocketing electric prices for residential and small business users as a result of datacenters over the past three years”

      This is probably naïve. Prices skyrocketed in Germany for similar reasons before AI data centers were a thing.

    • paxys 4 days ago ago

      Watt is the hottest new currency in big tech. Want to launch something big? You don't have to ask for dollars or headcount or servers or whatever else used to be the bottleneck in the past. There's plenty of all this to go around (and if not it can be easily bought). Success or failure now depends on whether you can beg and plead your way to getting a large enough kilowatt/megawatt allocation over every other team that's fighting for it. Everything is measured this way.

    • apercu 3 days ago ago

      I had my highest power bill last month in 4 years, in a month that was unseasonably cool so no AC for most of the month. Why are we as citizens without equity in these businesses subsidizing the capital class?

    • ianks 4 days ago ago

      To me, the question is less about “how do we make more energy” and more about “how do we make LLMs 100x more energy efficient.” Not saying this is an easy problem to solve, but it all seems like a stinky code smell.

      • sothatsit 4 days ago ago

        I'm pretty confident that if LLMs were made 100x more energy efficient, we would just build bigger LLMs or run more parallel inference. OpenAI's GPT-5 Pro could become the baseline, and their crazy expensive IMO model could become the Pro offering. Especially if that energy efficiency came with speedups as well (I would be surprised if it didn't). The demand for smarter models seems very strong.

    • XorNot 3 days ago ago

      This feels like a return to the moment just before Deepseek when the market was feeling all fat and confident that "more GPUs == MOAR AI". They don't understand the science, so they really want a simple figure to point to that means "this is the winner".

      Framing it in GW is just giving them what they want, even if it makes no sense.

    • apimade 4 days ago ago

      An 8% increase year over year is quite substantial; however, keep in mind that globally we experienced the 2022 fuel shock. In Australia, for example, we saw energy prices double that year.

      Although wholesale electricity prices show double-digit average year-on-year swings, their true long-run growth is closer to ~6% per year, slightly above wages at ~4% during the same period.

      So power has become somewhat less affordable but remains a small share of household income; wage growth has absorbed much of the real impact.

      You can make it sound shocking with statements like “In 1999, a household’s wholesale power cost was about $150 a year, in 2022, that same household would be charged more than $1,000, even as wages only grew 2.5x”, but the real impact (on average, obviously there are outliers and low income households are disproportionately impacted in areas where gov doesn’t subsidise) isn’t major.
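Restating the comment's own numbers in wage-adjusted terms (a sketch, my own arithmetic):

```python
# Wholesale power cost for a household, per the figures above.
bill_1999, bill_2022 = 150, 1000  # AUD/year
wage_growth = 2.5                 # wages grew 2.5x over the same period

nominal = bill_2022 / bill_1999
wage_adjusted = nominal / wage_growth
print(f"nominal {nominal:.1f}x, wage-adjusted {wage_adjusted:.1f}x")  # 6.7x, 2.7x
```

So even on these numbers, the wholesale component got roughly 2.7x more expensive measured in wage-hours.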

      https://www.aer.gov.au/industry/registers/charts/annual-volu...

      • atkailash 4 days ago ago

        I wouldn’t call a $100-270 electric bill a “fraction” when it’s about 5% of post-tax income. I use a single light on a timer and have a small apartment.

        Especially since these sorts of corporations can get tax breaks or have means of getting regulators to allow spreading the cost. Residential customers shouldn’t see any increase due to data centers, but they do, and will, subsidize them while seeing minimal changes to their own infrastructure.

        When people are being told to minimize air conditioning while these big datacenters are built and never told to “reduce your consumption,” it doesn’t matter how big or small the electric bill is: it’s subsidizing a multi-billion-dollar corporation’s toy.

      • bushbaba 4 days ago ago

        6% YoY is much higher than the 2-3% inflation target

      • richrichardsson 3 days ago ago

        So a 6.6x increase in power bill, offset by a 2.5x wage increase has no major impact?

        I'm sure none of the other outgoings for a household saw similar increases. /s

    • mvanbaak 4 days ago ago

      0,19 per kWh. Damn man, here it is like 0,97 per kWh (Western Europe)… stop complaining.

      • Rexxar 4 days ago ago

        Regulated price in France:

        - 0,1952 per kWh for uniform price.

        - 0,1635 / 0,2081 for day/night pricing

        - 0,1232 /... / 0,6468 for variable pricing

        https://particulier.edf.fr/content/dam/2-Actifs/Documents/Of...

        You have a very bad deal if you pay 0.97€ per kWh.

      • patrickmcnamara 4 days ago ago

        This is not true. The average in the EU is 0,287 €/kWh. I pay 0,34 €/kWh in Berlin.

        • distances 3 days ago ago

          And in Germany the price includes transmission and taxes, it's the consumer end price. You have to remember that some countries report electricity price without transmission or taxes, also in consumer context, so you need to be careful with comparisons.

    • whatever1 4 days ago ago

      DCs need to align their training cycles with the peak of renewable power generation

      • justincormack 4 days ago ago

        They are starting to include batteries so they don't have to adjust to external factors.

    • cavisne 4 days ago ago

      Utilities always need to justify rate increases with the regulator.

      The bulk of cost increases come from the transition to renewable energy. You can check your local utility and see.

      It’s very easy to make a huge customer like a data center directly pay the cost needed to serve them from the grid.

      Generation of electricity is more complicated; the data centers pulling cheap power from Columbia River hydro are starting to compete with residential users.

      Generation is a tiny fraction of electricity charges though.

    • stogot 4 days ago ago

      I actually don’t like this measurement, as it’s vague and dilutes the announcement. Each product delivers a different amount of compute per watt.

      Imagine Ford announced “a strategic partnership with FedEx to deploy 10 giga-gallons of ICE vehicles”

      • mensetmanusman 4 days ago ago

        It’s a sticky metric though, because Moore’s law per unit of power consumption died years ago.

    • dantillberg 4 days ago ago

      Prices of _everything_ went up over the past five years. Datacenter expansion was far from the main driver. Dollars and cents aren't worth what they used to be.

      • basilgohar 4 days ago ago

        Elsewhere it was mentioned that DCs pay less for electricity per Wh than residential customers. If that is the case, then it's not just about inflation, but also unfair pricing putting more of the infrastructure costs on residential customers whereas the demand increase is coming from commercial ones.

        • aaronmdjones 4 days ago ago

          Industrial electricity consumers pay lower unit rates per kWh, but they also pay for any reactive power that they consume and then return -- residential consumers do not. As in, what industrial consumers actually pay is a unit cost per kVAh, not kWh.

          This means loads with pretty abysmal power factors (like induction motors) actually end up costing the business more money than if they ran them at home (assuming the home had a sufficient supply of power).

          Further, they get these lower rates in exchange for being deprioritised -- in grid instability (e.g. an ongoing frequency decline because demand outstrips available supply), they will be the first consumers to be disconnected from the grid. Rolling blackouts affecting residential consumers are the last resort.

          There are two sides to this coin.

          Note that I am in no way siding with this whole AI electricity consumption disaster. I can't wait for this bubble to pop so we can get back to normality. 10GW is a third of the entire daily peak demand of my country (the United Kingdom). It's ridiculous.

          Edit: Practical Engineering (YouTube channel) has a pretty decent video on the subject. https://www.youtube.com/watch?v=ZwkNTwWJP5k
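To make the kVAh point concrete, here is a sketch of what power-factor billing looks like (the 100 kW motor and 0.8 power factor are hypothetical numbers of mine):

```python
# Real vs. apparent energy for a load with a poor power factor.
# Apparent power S = P / cos(phi); industrial tariffs described above
# bill on kVAh, residential meters on kWh.
real_kw = 100.0       # real power drawn by a hypothetical induction motor
power_factor = 0.8    # cos(phi), fairly poor
hours = 1.0

apparent_kva = real_kw / power_factor
kwh = real_kw * hours        # what a residential meter would record
kvah = apparent_kva * hours  # what the industrial meter records
print(f"{kwh:.0f} kWh real, {kvah:.0f} kVAh billed")  # 100 kWh real, 125 kVAh billed
```

The same load is billed for 25% more units on an industrial tariff, which is the other side of the lower unit rate.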

    • randomNumber7 4 days ago ago

      I mean, gigawatts is a concise metric for grasping the amount of GPU compute they're installing, but the honesty about the power draw seems a bit strange to me.

      • fuzzfactor 4 days ago ago

        Total gigawatts is the maximum amount of power that can be supplied from the power generating station and consumed at the DC through the infrastructure and hardware as it was built.

        Whether they use all those gigawatts and what they use them for would be considered optional and variable from time to time.

    • mullingitover 4 days ago ago

      > So where is this 10GW electric supply going to come from

      If the US petro-regime wasn't fighting against cheap energy sources this would be a rounding error in the country's solar deployment.

      China deployed 277GW of solar in 2024 and is accelerating, having deployed 212GW in the first half of 2025. 10 GW could be a pebble in the road, but instead it will be a boulder.

      Voters should be livid that their power bills are going up instead of plummeting.

      • Saline9515 3 days ago ago

        FYI, announced capacity is very far from real capacity when dealing with renewables. It's like saying that because you bought a Ferrari, you can now drive at 300 km/h on the road all of the time.

        In mid latitudes, 1 GW of solar power produces around 5.5 GWh/day. So the "real" equivalent is a 0.23 GW gas or nuclear plant (even lower when accounting for storage losses).

        But "China installed 63 GW-equivalent" of solar power is a bit less interesting, so we go for the fake figures ;-)

        • FooBarWidget 3 days ago ago

          You think they don't know that too? You can bet they're investing heavily in grid-level storage too.

          • Saline9515 3 days ago ago

            I was commenting on the initially announced number. And storage at this scale doesn't exist right now. The most common form, pumped-hydro reservoirs, requires hard-to-find sites that are typically in the Himalayas, far away from where the power is produced. And the environmental cost isn't pretty either.

      • parineum 4 days ago ago

        I'm living in one of the most expensive electricity markets in the US. It has a lot more to do with the state shutting down cheap petro energy (natural gas) and nuclear then replacing it with... tbd.

      • bushbaba 4 days ago ago

        How would that solar power a DC at night or on a cloudy day? Energy storage isn’t cheap.

        • mullingitover 4 days ago ago

          In 2025 it’s cheaper to demolish an operating coal plant and replace it with solar and battery, and prices are still dropping.

          • parineum 4 days ago ago

            Why aren't all these businesses doing that then?

    • p1necone 4 days ago ago

      Theoretically couldn't you use all the waste heat from the data center to generate electricity again, making the "actual" consumption of the data center much lower?

      • quasse 4 days ago ago

        Given that steam turbine efficiency depends on the temperature delta between steam input and condenser, unlikely unless you're somehow going to adapt Nvidia GPUs to run with cooling loop water at 250C+.

      • pjc50 3 days ago ago

        Thermodynamics says no. In fact you have to spend energy to remove that heat from the cores.

        (Things might be different if you had some sort of SiC process that let you run a GPU at 500C core temperatures, then you could start thinking of meaningful uses for that, but you'd still need a river or sea for the cool side just as you do for nuclear plants)
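The Carnot bound makes this concrete (a sketch; the coolant and ambient temperatures are my own assumptions):

```python
# Maximum fraction of waste heat recoverable as work:
# eta_max = 1 - T_cold / T_hot, with temperatures in kelvin.
def carnot(t_hot_c: float, t_cold_c: float) -> float:
    return 1 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

# Typical GPU coolant (~60 C) against ambient (~25 C): barely anything,
# before any real-turbine losses are even counted.
print(f"{carnot(60, 25):.1%}")   # 10.5%

# The hypothetical 500 C SiC chip mentioned above would change the picture.
print(f"{carnot(500, 25):.1%}")  # 61.4%
```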

      • distances 3 days ago ago

        In the Nordics the waste heat is used for district heating. This practical heat sink really favors northern countries for datacenter builds. In addition you usually get abundant water and lower population density (meaning easier to build renewables that have excess capacity).

      • Blackthorn 4 days ago ago

        No.

  • paxys 4 days ago ago

    > letter of intent for a landmark strategic partnership

    > intends to invest up to xxx progressively

    > preferred strategic compute and networking partner

    > work together to co-optimize their roadmaps

    > look forward to finalizing the details of this new phase of strategic partnership

    I don't think I have seen so much meaningless corporate speak and so many outs in a public statement. "Yeah we'll maybe eventually do something cool".

    • BHSPitMonkey 4 days ago ago

      NVDA's share price enjoyed a nice $6 bump today, so the announcement did what it was supposed to do.

      In a sense, it's just an ask to public investors for added capital to do a thing, and evidently a number of investors found the pitch compelling enough.

      • paxys 4 days ago ago

        Increase in share price doesn't provide extra cash to a company. They'd have to issue new shares for that.

        • jedberg 4 days ago ago

          It doesn't directly, but it helps because they can do deals where they buy things with stock, like people's labor or small companies, and now that "money" is more valuable.

        • onesociety2022 4 days ago ago

          It does help with employee stock compensation. If your stock doubled in the past year, then you just need to dole out 50% of shares as last year in equity refreshers to retain talent.

          • paxys 4 days ago ago

            Nvidia probably has the opposite problem - employee stock has appreciated so much that you have to convince them not to retire.

            • onesociety2022 4 days ago ago

              Maybe but people's spending also dramatically goes up as they start making more money. You buy that $5m vacation home at Tahoe, you buy fully-loaded Rivian SUVs, you send your kids to expensive private schools, you fly only first-class on family vacations, and you are back to needing to work more to sustain this lifestyle.

          • nenenejej 4 days ago ago

            This assumes your staff are not a bunch of boglehead freaks constantly on blind and crunching spreadsheets and grinding their leetcode for that perfectly timed leap.

            RSU vesting is a bit like options. You have the option but not the obligation to stay in the job!

        • hshshshshsh 4 days ago ago

          But the company owns stock, right? So they can sell those shares, no?

          • paxys 4 days ago ago

            It can, but investors don't like that since it dilutes the value of their own shares. Which is why large companies usually do the opposite - share buybacks. Nvidia in fact bought $24 billion worth of its own shares in the first half of 2025, and plans to spend $60 billion more in buybacks in upcoming months.

            • littlecranky67 4 days ago ago

              Which investors also usually don't like. It says "we have all this cash, but we have no idea what to do with it, so we are buying our own stock". Whereas I'd expect a company to actually invest its excess cash (into research, tech, growth, etc.) to make more money in the future.

              • inemesitaffia 4 days ago ago

                Preferred by some to dividends.

                • roland35 4 days ago ago

                  If stock buybacks cause the price to go up like it should in theory, that's less of a tax hit than dividends! I'll take it

            • lotsofpulp 4 days ago ago

              That has to be compared with how much stock the company is “selling”, via equity compensation to employees.

    • casey2 3 days ago ago

      The "meaning" is clear, create FOMO among suckers.

  • ddtaylor 4 days ago ago

    For someone who doesn't know what a gigawatt worth of Nvidia systems is, how many high-end H100s or whatever does this get you? My estimates, along with some poor-grade GPT research, lead me to think it could be nearly 10 million? That does seem insane.

    • kingstnap 4 days ago ago

      It's a ridiculous amount claimed, for sure. If it's 2 kW per GPU, it's around 5 million, and 1 to 2 kW is definitely the right ballpark at a system level.

      The NVL72 packs 72 chips and draws 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.
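That per-GPU estimate, and the resulting GPU count for 10 GW, as a quick sketch from the NVL72 figures above:

```python
# Per-GPU power from the NVL72 figures, then total GPUs per 10 GW.
rack_kw, gpus_per_rack, cooling_kw = 120, 72, 25

kw_per_gpu = (rack_kw + cooling_kw) / gpus_per_rack
total_gpus = 10e6 / kw_per_gpu  # 10 GW expressed in kW
print(f"{kw_per_gpu:.2f} kW/GPU -> {total_gpus / 1e6:.1f} million GPUs")  # 2.01 kW/GPU -> 5.0 million GPUs
```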

      • fuzzfactor 4 days ago ago

        So each 2 kW unit is like a top-shelf space heater, which the smart money never did want to run unless it was quite cold outside.

        • willis936 4 days ago ago

          It will be the world's most advanced resistor.

    • thrtythreeforty 4 days ago ago

      Safely in "millions of devices." The exact number depends on assumptions you make regarding all the supporting stuff, because typically the accelerators consume only a fraction of total power requirement. Even so, millions.

      • cj 4 days ago ago

        "GPUs per user" would be an interesting metric.

        (Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.

        That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.

        Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimates 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate break down.
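
        The back-of-envelope estimate reads roughly like this in code (all three inputs are the comment's rough guesses, not reported figures):

        ```python
        # Rough GPUs-per-user estimate, using the comment's guessed inputs.
        gpus = 1_000_000        # "well over 1 million GPUs" by end of year
        users = 800_000_000     # ~800 million ChatGPT users
        active_fraction = 0.05  # assume ~1.2 hours of active use per day

        users_per_gpu = users / gpus
        concurrent_users_per_gpu = users_per_gpu * active_fraction
        print(users_per_gpu, concurrent_users_per_gpu)  # 800.0 40.0
        ```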

        • coder543 4 days ago ago

          A lot of GPUs are allocated for training and research, so dividing the total number by the number of users isn’t particularly useful. Doubly so if you’re trying to account for concurrency.

        • NooneAtAll3 4 days ago ago

          I'm kinda scared of "1.2 hours a day of ai use"...

          • Rudybega 4 days ago ago

            Sorry, those figures are skewed by Timelord Georg, who has been using AI for 100 million hours a day, is an outlier, and should have been removed.

            • fuzzfactor 4 days ago ago

              Roger, but I still think with that much energy at its disposal, if AI performs as desired it will work its way up to using each person more than 1.2 hours per day, without them even knowing about it :\

              • Nevermark 4 days ago ago

                When GPUs share people concurrently, they collectively get much more than 24 hours of person per day.

                • fuzzfactor 4 days ago ago

                  You're right!

                  With that kind of singularity the man-month will no longer be mythical ;)

    • sandworm101 4 days ago ago

      At this scale, I would suggest that these numbers are for the entire data center rather than a sum of the processor demands. Also, the "infrastructure partnership" language suggests more than just compute. So I would add cooling into the equation, which could be as much as half the power load, or more depending on where they intend to locate these datacenters.

    • skhameneh 4 days ago ago

      Before reading your comment I did some napkin math using 600W per GPU: 10,000,000,000 / 600 = 16,666,666.66...

      With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.

    • alphabetag675 4 days ago ago

      Account for around 3 MW for every 1,000 GPUs, i.e. 3 kW per GPU. At that rate, 10 GW works out to around 3.33 million GPUs.

    • ProofHouse 4 days ago ago

      How much cable (and what kind) to connect them all? That number would be 100x the number of GPUs. I would have thought they just clip onto metal racks with no cables, but then I saw the xAI data center with blue wire cables everywhere.

    • iamgopal 4 days ago ago

      And how much is that as a percentage of Bitcoin network capacity?

      • mrb 4 days ago ago

        Bitcoin mining consumes about 25 GW: https://ccaf.io/cbnsi/cbeci so this single deal amounts to about 40% of that.

        To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what all specialized Bitcoin ASIC mine.

        Edit: some math I did out of sheer curiosity: a modern top-of-the-line GPU would mine BTC at about 10 Ghash/s (I don't think anyone tried but I wrote GPU mining software back in the day, and that is my estimate). Nvidia is on track to sell 50 million GPUs in 2025. If they were all mining, their combined compute power would be 500 Phash/s, which is 0.05% of Bitcoin's global mining capacity.
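
        The 0.05% figure checks out in a few lines (all three inputs are the commenter's own estimates):

        ```python
        # Fraction of Bitcoin's hashrate a year of Nvidia GPU sales could
        # supply, using the estimates above: 10 GH/s per GPU, 50M GPUs sold
        # in 2025, and ~1,000 EH/s of global network hashrate (ballpark).
        gpu_hashrate_hs = 10e9   # 10 GH/s per GPU (estimate)
        gpus = 50e6              # GPUs sold in 2025 (estimate)
        network_hs = 1_000e18    # ~1,000 EH/s global Bitcoin hashrate

        fleet_hs = gpu_hashrate_hs * gpus  # 5e17 H/s = 500 PH/s
        share = fleet_hs / network_hs
        print(f"{share:.2%}")  # 0.05%
        ```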

      • cedws 4 days ago ago

        I'm also wondering what kind of threat this could be to PoW blockchains.

        • typpilol 4 days ago ago

          Literally none at all, because of ASICs.

          • fuzzfactor 4 days ago ago

            What happens if AI doesn't pay off before the GPUs wear out or are in need of replacement?

            So at that point a DC replaces them all with ASICs instead?

            Or if they just feel like doing that any time.

          • cedws 3 days ago ago

            Some chains are designed to be ASIC resistant.

    • az226 3 days ago ago

      Vera Rubin will be about 2.5 kW and Feynman will be about 4 kW.

      All-in, you’re looking at a higher footprint, maybe 4-5 kW per GPU blended.

      So about 2 million GPUs.

    • awertjlkjl 4 days ago ago

      You could think of it as "as much power as is used by NYC and Chicago combined". Which is fucking insanely wasteful.

      • onlyrealcuzzo 4 days ago ago

        I dunno.

        Google is pretty useful.

        It uses >15 TWh per year.

        Theoretically, AI could be more useful than that.

        Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.

        It could be a short-term crunch to pull-forward (slightly) AI advancements.

        Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.

        Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.

        VC "valuations" are already a joke. They're more like minimum valuations. If OpenAI is worth anywhere near it's current "valuations", Nvidia would be criminally negligent NOT to invest at a 90% discount (the marginal profit on their chips).

        • dns_snek 4 days ago ago

          According to Google's latest environmental report[1] that number was 30 TWh per year in 2024, but as far as I can tell that's their total consumption of their datacenters, which would include everything from Google Search, to Gmail, Youtube, to every Google Cloud customer. Is it broken down by product somewhere?

          30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is 3x more energy intensive.

          Ultimately the difference in `real value/MWh` between these two must be many orders of magnitude.

          [1] https://sustainability.google/reports/google-2025-environmen...
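
          The energy-to-average-power conversion used above is:

          ```python
          # Convert annual energy consumption (TWh/year) to average power (GW).
          HOURS_PER_YEAR = 24 * 365  # 8760, ignoring leap years

          def twh_per_year_to_gw(twh: float) -> float:
              return twh * 1e12 / HOURS_PER_YEAR / 1e9  # Wh/year -> W -> GW

          print(f"{twh_per_year_to_gw(30):.1f} GW")  # ~3.4 GW for 30 TWh/year
          ```

          The same function gives ~1.7 GW for the 15 TWh/year figure quoted elsewhere in the thread.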

          • onlyrealcuzzo 3 days ago ago

            Data centers typically use 60% (or less) on average of their max rating.

            You over-provision so that you (almost) always have enough compute to meet your customers needs (even at planet scale, your demand is bursty), you're always doing maintenance on some section, spinning up new hardware and turning down old hardware.

            So, apples to apples, this would likely not even be 2x at 30TWh for Google.

        • tmiku 4 days ago ago

          For other readers: "15 TWh per year" is equivalent to 1.71 GW, 17.1% of the "10GW" number used to describe the deal.

          • mNovak 4 days ago ago

            This is ignoring the utilization factor though. Both Google and OpenAI have to overprovision servers for the worst case of simultaneous users. So 1.71 GW average doesn't tell us the maximum instantaneous GW capacity of Google -- if we pull a 4x out of the hat (i.e. peak usage is 4x above average), it becomes ~7 GW of available compute.

            More than a "Google" of new compute is of course still a lot, but it's not many Googles' worth.

        • Capricorn2481 4 days ago ago

          Does Google not include AI?

      • jazzyjackson 4 days ago ago

        I mean if 10GW of GPUs gets us AGI and we cure cancer then that's cool, but I do get the feeling we're just getting uncannier chatbots and fully automated tiktok influencers

        • yard2010 4 days ago ago

          Current llms are just like farms. Instead of tomatoes by the pound you buy tokens by the pound. So it depends on the customers.

        • junon 4 days ago ago

          This is also my take. I think a lot of people miss the trees for the forest (intentionally backward).

          AI that could find a cure for cancer isn't the driving economic factor in LLM expansion, I don't think. I doubt cancer researchers are holding their breath on this.

        • rebolek 4 days ago ago

          And when it’s built, Sam Altman will say: We are so close, if we get 10TW, AGI will be here next year!

      • diego_sandoval 4 days ago ago

        Do you think the existence of NYC and Chicago is insanely wasteful?

  • seydor 4 days ago ago

    We are way past peak LLM and it shows. They are basically advertising space heating as if it's some sort of advancement, while the tech seems to have stagnated, and they're just making the horses faster. The market should have punished this

    • nilkn 4 days ago ago

      It's 100% plausible and believable that there's going to be a spectacular bubble popping, but saying we are way past peak LLM would be like saying we were way past peak internet in 1999-2001 -- in reality, we weren't even remotely close to peak internet (and possibly still aren't). In fact, we were so far from the peak in 2001 that entire technological revolutions occurred many years later (e.g., smartphones) that just accelerated the pace even further in ways that would've been hard to imagine at the time. It's also important to note that AI is more than text-based LLMs -- self-driving cars and other forms of physical "embodied" AI are progressing at exponential pace, while entirely new compute form factors are only just now starting to emerge yet are almost certainly guaranteed to become pervasive as soon as possible (e.g., real AR glasses). Meanwhile, even plain-old text-based LLMs have not actually stagnated.

      • huijzer 4 days ago ago

        [flagged]

        • aurareturn 4 days ago ago

            “You should expect OpenAI to spend trillions of dollars on data center construction in the not very distant future,” he told the room, according to a Verge reporter.
          
            “We have better models, and we just can’t offer them, because we don’t have the capacity,” he said. GPUs remain in short supply, limiting the company’s ability to scale.
          
          https://finance.yahoo.com/news/sam-altman-admits-openai-tota...

          So why would Altman say AI is in a bubble but OpenAI wants to invest trillions? Here's my speculation:

          1. OpenAI is a private company. They don't care about their own stock price.

          2. OpenAI just raised $8.3b 3 weeks ago on $300b valuation ($500b valuation today). He doesn't care if the market drops until he needs to raise again.

          3. OpenAI wants to buy some AI companies but they're too expensive so he's incentivized to knock the price of those companies down. For example, OpenAI's $3b deal for Windsurf fell apart when Google stepped in and hired away the co-founder.

          4. He wants to retain OpenAI's talent because Meta is spending billions hiring away the top AI talent, including talent from OpenAI. By saying it's in a bubble and dropping public sentiment, the war for AI talent could cool down.

          5. He wants other companies to get scared and not invest as much while OpenAI continues to invest a lot so it can stay ahead. For example, maybe investors looking to invest in Anthropic, xAI, and other private companies are more shaky after his comments and invest less. This benefits OpenAI since they just raised.

          6. You should all know that Sam Altman is manipulative. This is how he operates. Just google "Sam Altman manipulative" and you'll see plenty of examples where former employees said he lies and manipulates.

          • savorypiano 4 days ago ago

            Altman wants OTHERS to spend trillions on GPUs. He needs the scaling hype to continue so he can keep getting investors to put money in hopes of an AGI breakthrough. If there is no funding, OpenAI is immediately bankrupt.

    • MangoCoffee 4 days ago ago

      >We are way past peak LLM and it shows

      The dot com bubble saw crazy deals and valuations, followed by a crash.

      Some companies emerged from it and went on to become giants, like Amazon. Let's hope this AI boom has some similar outcomes.

      • spacebanana7 3 days ago ago

        In hindsight, the dot com bubble was really the dot com dip.

    • jama211 4 days ago ago

      There will be a great market correction soon. Long term though it’ll still have some value, much like after the dot com crash the internet still remained useful. I hope.

    • tim333 3 days ago ago

      Though "Compute infrastructure will be the basis for the economy of the future" doesn't sound that off. LLMs may go but compute will live on. Bit like web portals and optical fiber.

    • bwfan123 4 days ago ago

      > We are way past peak LLM

      in the sense that all of the positive narrative is getting priced in.

  • agentultra 4 days ago ago

    Water is a critical resource in dwindling supplies in many water-stressed regions. These data centers have been known to suck up water supplies during active droughts. Is there anyone left at the EPA that gets a say in how we manage water for projects like this?

    • thorncorona 4 days ago ago

      We deplete our midwestern aquifers to make ethanol which we burn, and we grow almonds in California.

      Both of those have significantly more water impact. Both of those are significantly less useful.

      Why not focus on issues that matter?

      • 34679 3 days ago ago

        Food and fuel are significantly more useful than chatbots.

      • AlecSchueler 3 days ago ago

        Why either/or? This is largely a tech forum so almond crops don't need to be the big area of focus or where we as a community can offer our best knowledge/coordination.

    • deelowe 4 days ago ago

      Water is much less of an issue than the media makes it out to be. It's a problem in some specific areas, yes, but power is a much bigger concern.

      • Manuel_D 4 days ago ago

        And even where water scarcity is a problem, heat exchangers can be configured to use wastewater. The Palo Verde plant does this.

        • deelowe 4 days ago ago

          Correct. There are a variety of solutions. Each DC is somewhat unique, but in general water isn't a huge concern. Cities make a big deal about it b/c they want the hyper scalers to give concessions such as processing gray water for the local muni.

    • MagicMoonlight 4 days ago ago

      Where is this water meme coming from? Surely the water is just pumped around, not actually used up?

      • p1mrx 4 days ago ago

        Evaporative cooling effectively "uses up" the water. It's possible to run chillers instead, but that consumes more electricity, and some power plants also use evaporative cooling.

        • kingstnap 4 days ago ago

          Some water usage has highly questionable counting methodologies.

          Like: if a datacenter is using hydroelectric power, you count the evaporation from the dam reservoir as "used water".

          I'm not an expert but imo correct accounting should really only consider direct consumption. It's very silly when we play games like having petro states have very high carbon footprints even if they don't actually burn the fuel.

    • sandspar 4 days ago ago

      Am I correct that your argument is something like, "AI endangers our water supply"? If so, what evidence would it take for you to change your mind? Maybe someone here can provide it.

      • agentultra 3 days ago ago

        No.

        The argument is that water management policy is lacking and supplies are dwindling, shouldn’t we have better oversight of this resource before we let corporations run full speed ahead?

    • fatal_errr 4 days ago ago

      the e p what?

      • baggachipz 4 days ago ago

        The entire premise of The Simpsons Movie is an artifact of another time. Sigh.

    • SilverElfin 4 days ago ago

      Water pollution is a bigger danger than water usage. Look up videos of people whose water changes color after a data center was built nearby.

      • agentultra 4 days ago ago

        They cause all kinds of problems. We could even include all of the new methane power plants that will likely need to be built.

    • dopa42365 3 days ago ago

      Some 1400 cubic kilometers of water evaporate every day on our blue planet here. The water isn't deleted, really.

      • agentultra 2 days ago ago

        Ah you haven’t read the latest reports. We’re losing fresh water at a rate faster than models had anticipated. Once it joins the ocean, it takes a painfully long time to build up on land again. And with rising temperatures it’s not being retained on land as much.

        World is getting thirsty.

  • jlokier 4 days ago ago

    onlyrealcuzzo wrote:

    > Google is pretty useful. It uses 15 TWh per year.

    15TWh per year is about 1.7GW.

    Assuming the above figures, that means OpenAI and Nvidia new plan will consume about 5.8 Googles worth of power, by itself.

    At that scale, there's a huge opportunity for ultra-low-power AI compute chips (compared with current GPUs), and right now there are several very promising technology pathways to it.

    • lukan 4 days ago ago

      " there's a huge opportunity for ultra-low-power AI compute chips (compared with current GPUs), and right now there are several very promising technology pathways to it"

      Sharing an example would be nice. Of how much power reduction are we talking here?

    • OtherShrezzing 4 days ago ago

      This one datacenter should be able to perform a 51% attack on any of the big cryptocurrencies with that much compute.

      An interesting hedge in case the AI bubble pops.

      • typpilol 4 days ago ago

        Someone did the math above and said all of it would only be about 0.05 percent for Bitcoin.

        I'm not sure about the GPU pow coins though

      • Lionga 4 days ago ago

        Nope, anything besides ASICs are useless for crypto mining.

      • dgfitz 4 days ago ago

        Kansas City shuffle?

      • aurareturn 4 days ago ago

        You're downvoted but it's a real threat. Imagine hackers or state sponsored entities use one of these mega data centers to destroy a few cryptocurrencies.

        • typpilol 4 days ago ago

          They are nothing compared to BTC Asics

          • kevinrineer 4 days ago ago

            Comparing 100 duck-sized horses to 1 horse-sized duck. Or perhaps the number of GPUs is in the ratio of 1,000:1.

  • JoshGlazebrook 4 days ago ago

    No one ever talks about the electricity demands for powering these things. Electric bills here in NJ via PSEG have spiked over 50% and they are blaming increased demand from datacenters, yet they don't seem to charge datacenters more?

    https://www.datacenterdynamics.com/en/news/new-jersey-utilit...

    • bigwheels 4 days ago ago

      A classic political games move, and it says more about how much anti-consumer nonsense is tolerated in New Jersey than it does about power generation and distribution pricing realities.

      The data centers will naturally consolidate in areas with competitive electricity pricing.

      • Gud 4 days ago ago

        Define, "competitive electricity pricing". Because surely this will make the electricity prices less competitive for the you and me...

    • m101 3 days ago ago

      This is called marginal pricing. Everyone pays the price of the marginal producer.

      In some cases they try to get the data centres to pay for their infrastructure costs up front, but the argument is that customers don't normally pay this directly; they pay it through usage fees over time.
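
      A toy sketch of a uniform clearing-price ("marginal pricing") market, with made-up bids, shows why new demand raises the price for everyone:

      ```python
      # Uniform clearing-price auction: every dispatched generator is paid the
      # bid of the marginal (most expensive) unit needed to meet demand.
      # Bids are hypothetical $/MWh figures; assume 1 MW per plant.
      bids = {"hydro": 20, "nuclear": 30, "gas": 60, "peaker": 120}
      demand_mw = 3  # new datacenter load means we need 3 of the 4 plants

      dispatched = sorted(bids.items(), key=lambda kv: kv[1])[:demand_mw]
      clearing_price = dispatched[-1][1]
      print(clearing_price)  # 60: even the $20/MWh hydro plant is paid $60/MWh
      ```

      Push demand up by one more megawatt and the $120/MWh peaker sets the price for every buyer on the grid.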

    • lotsofpulp 4 days ago ago

      That is on NJ government for allowing the price increases. They can easily say no.

  • gorbypark 3 days ago ago

    That's a lot. I always had this idea in the back of my mind that British Columbia should get in on the AI game and try and get data centers located in BC because we generally have a lot of "excess" hydro generation capacity. There's a new mega dam recently opened that had lots of criticism about it being "unneeded".

    That mega dam (Site C) produces 1.1 GW of power.

  • fufxufxutc 4 days ago ago

    In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times.

    • landl0rd 4 days ago ago

      Nvidia has consistently done this with Coreweave, Nscale, really most of its balance sheet investments are like this. On the one hand there's a vaguely cogent rationale that they're a strategic investor and it sort of makes sense as a hardware-for-equity swap; on the other, it's obviously goosing revenue numbers. This is a bigger issue when it's $100B than with previous investments.

      It's a good time to gently remind everyone that there are a whole pile of legal things one can do to change how a security looks "by the numbers" and this isn't even close to the shadiest. Heck some sell-side research makes what companies themselves do look benign.

      • yannyu 4 days ago ago

        A relevant joke, paraphrased from the internet:

        Two economists are walking in a forest when they come across a pile of shit.

        The first economist says to the other “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.

        They continue walking until they come across a second pile of shit. The second economist turns to the first and says “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats the pile of shit.

        Walking a little more, the first economist looks at the second and says, "You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can't help but feel like we both just ate shit for nothing."

        "That's not true", responded the second economist. "We increased total revenue by $200!"

        • paxys 4 days ago ago

          The punchline is supposed to be GDP, but yeah, same concept.

      • hoosieree 4 days ago ago

        This should go without saying but unfortunately it really doesn't these days:

        This kind of corporate behavior is bad and will end up hurting somebody. If we're lucky the fallout will only hurt Nvidia. More likely it will end up hurting most taxpayers.

    • rzerowan 4 days ago ago

      It's the same loop-de-loop NVIDIA is doing with Coreweave, as I understand it: 'investing' in Coreweave, which then 'buys' NVIDIA merch for cloud rental, resulting in Coreweave being among the top 4 customers of NVIDIA chips.

      • vessenes 4 days ago ago

        Wait, why the quotes? NVDA sends cash, and the Coreweave spends it, no? I don’t think quotes are accurate, if they imply these transactions aren’t real, and material. At the end of the day, NVDA owns Coreweave stock, and actual, you know, physical hardware is put into data centers, and cash is wired.

        • rzerowan 3 days ago ago

          Well we do have the precedent of HPE/Autonomy in the UK [1], which ruled that the process is essentially fraud. Whether there will be a prosecution in the current corporate environment remains to be seen. Essentially though the roundtrip revenue inflation was already ruled illegal.

          [1]https://www.corpdev.org/2025/07/23/hp-awarded-945-million-in...

    • GuB-42 4 days ago ago

      I don't really understand how it is round tripping.

      In the end, Nvidia will have OpenAI shares, which are valuable, and OpenAI will have GPUs, which are also valuable. It is not fake revenue, the GPUs will be made, sold at market price, and used, they are not intended to be bought back and sold to another customer. And hopefully, these GPUs will be put to good use by OpenAI so that they can make a profit, which will give Nvidia some return on investment.

      It doesn't look so different from a car loan, where the dealer lends you the money so that you can buy their car.

      • nmfisher 4 days ago ago

        A dollar is always a dollar, so it's hard to claim that $1 million in revenue is actually worth $10 million. OpenAI shares, on the other hand, aren't publicly traded, so it's much easier to claim they're worth $10 million when noone would actually be willing to buy for more than $1 million.

        It's not necessarily manipulative but it's also not exactly an arms-length purchase of GPUs on the open market.

      • udkl 4 days ago ago

        It looks like NVIDIA is looking to move up the value chain to take a stake in the even higher-margin addressable market instead of simply selling the tools.

      • treis 4 days ago ago

        If OpenAI doesn't pan out then Nvidia has worthless OpenAI stock and OpenAI has a pile of mostly useless GPUs.

        • dwaltrip 4 days ago ago

          That’s still not round tripping?

    • FinnKuhn 4 days ago ago

      For example, they did a similar deal with Nscale just last week.

      https://www.cnbc.com/2025/09/17/ai-startup-nscale-from-uk-is...

    • Aurornis 4 days ago ago

      This is being done out in the open (we’re reading the press announcement) and will be factored into valuations.

      Also, investing in OpenAI means they get equity in return, which is not a worthless asset. There is actual mutually beneficial trade occurring.

      • mxschumacher 3 days ago ago

        What is OpenAI's durable, competitive advantage that differentiates it from the numerous other LLM providers? Investing at a $500bn valuation in a company that's losing money and has bad unit economics seems rather aggressive.

    • klysm 4 days ago ago

      Is it counting revenue multiple times? It's buying your own products really, but not sure how that counts as double counting revenue

      • rsstack 4 days ago ago

        Customer A pays you $100 for goods that cost you $10. You invest $100-$10=$90 in customer B so that they'll pay you $90 for goods that cost you $9. Your reported revenue is now $100+$90=$190, but the only money that entered the system is the original $100.
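
          The arithmetic in that example, spelled out (a toy illustration of the mechanism, not Nvidia's actual accounting):

          ```python
          # Toy round-trip: one $100 payment shows up as $190 of reported revenue.
          cash_entering_system = 100

          revenue = 0
          revenue += 100         # customer A pays $100 (cost of goods: $10)
          investable = 100 - 10  # the $90 left is "invested" in customer B
          revenue += investable  # B spends the $90 back on your goods (cost: $9)

          print(revenue, cash_entering_system)  # 190 100
          ```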

        • Aurornis 4 days ago ago

          Yes, but you’ve also incurred a $90 expense in purchasing the stock of Company B and that stock is on the balance sheet.

          In the actual shady version of this, Company B isn’t the hottest AI investment around, it’s a shell company created by your brother’s cousin that isn’t actually worth what you’re claiming on the balance sheet because it was only created for the round tripping shell game.

        • creddit 4 days ago ago

          Except that this isn't round-tripping at all. Round-tripping doesn't result in a company actually incurring expenses to create more product. Round-tripping is the term for schemes that enable you to double count assets/revenue without any economic effects taking place.

          Every time HackerNews talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech is doing is illegal. This is definitively neither illegal nor shady. If Nvidia believes, for example, that OpenAI can use their GPUs to turn a profit, then this is inherently positive sum economically for both sides: OpenAI gets capital in the form of GPUs, uses them to generate tokens which they sell above the cost of that capital, and then returns some of the excess value to Nvidia. This is done via equity. It's a way for Nvidia to get access to some of the excess value of their product.

          • bob1029 4 days ago ago

            At some point one might simply argue that the nature and timing of these wildly fantastical press releases is tantamount to a "scheme to defraud".

            • creddit 4 days ago ago

              “ Every time HackerNews talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech is doing is illegal.”

        • FinnKuhn 4 days ago ago

          And your valuation also rises as a consequence of your increased revenue.

      • lumost 4 days ago ago

        It's real revenue, but you are operating a fractional reserve revenue operation. If the company you're investing in has trouble, or you have trouble, the whole thing falls over very fast.

      • fufxufxutc 4 days ago ago

        The "investment" came from their revenue, and will be immediately counted in their revenue again.

        • weego 4 days ago ago

          In this case, if we're being strict, the investment could then also show up as fixed assets on the same balance sheet.

    • t0mas88 4 days ago ago

      Oracle also announced a lot of future revenue from AI, while they're part of Stargate Partners that is investing in OpenAI. Similar deal...

    • Mistletoe 4 days ago ago

      Isn’t our stock market basically propped up on this AI credits etc. house of cards right now?

    • mandeepj 4 days ago ago

      > this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product.

      Microsoft and Google have been doing it for decades. Probably, MS started that practice.

    • rsync 4 days ago ago

      "In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times."

      ... and we've seen this before in previous bubbles ...

    • selectodude 4 days ago ago

      This is some Enron shit. Let's see NVDA mark these profits to market. Keep the spice flowing.

  • moduspol 4 days ago ago

    Waiting patiently for the Ed Zitron article on this...

    • gitremote 4 days ago ago

      When executives can't measure success by output, they measure success by input, a perverse incentive that rewards inefficiency.

      Execs ask their employees to return to office, because they don't know how to measure good employee output.

      Now OpenAI and Nvidia measure success by gigawatt input into AI instead of successful business outcomes from AI.

    • timmytokyo 4 days ago ago
    • nextworddev 4 days ago ago

      He single-handedly cost people more than anyone with his bearish takes lol

      • topaz0 4 days ago ago

        Or he saved them more than anyone by limiting their losses when it does finally crash

        • nextworddev 4 days ago ago

          except he called the top in 2023

      • mikhmha 4 days ago ago

        It's not a good argument against him. I read his articles and he is absolutely correct about the state of things. Predicting the crash is a fool's errand. I don't use that as an argument to discredit what he actually writes regarding the raw economics of the AI industry.

        I say this as someone who has been holding NVDA stock since 2016 and can cash out for a large sum of money. To me its all theoretical money until I actually sell. I don't factor it into financial planning.

        You don't see me being a cheerleader for NVDA. Even though I stand to gain a lot. I will still tell you that the current price is way too high and Jensen Huang has gotten high off his own supply and "celebrity status".

        After all, we all can't buy NVDA stock and get rich off it. Is it truly possible for all 30,000+ NVDA employees to become multi-millionaires overnight? That's not how capitalism works.

        • locallost 3 days ago ago

          I am all against bubbles and irrational valuations etc., but I think in this case the prospect of future growth was fully justified. There are never guarantees, but Nvidia's price went up 10x or more in three years and e.g. their PER stayed mostly flat. But their PER of 50 three years ago would be 5 today, which would be extremely undervalued. I would say the "market" got it right this time.

        • nextworddev 4 days ago ago

          He’s been absolutely wrong on most things but spreading FUD is how he makes money, like Gary Marcus

          • mikhmha 4 days ago ago

            I don't care for personalities. You want to mark him as a grifter, but is that just an emotional response? I have not bought anything from Ed, I don't subscribe to his newsletter, I don't know much about him beyond visiting his website every few weeks and reading the free articles. He does not sell me vitality pills or coffee mugs. The only soliciting he does is his paid Substack.

            But it goes both ways? Because AI promoters are also spreading FUD. That's how they make money. Because their livelihoods are tied to this technology and all the valuations. So is spreading FUD for you just a condition on whether or not you agree with the person?

          • watwut 4 days ago ago

            If there is any FUD, it is from the other side. No one is scared after they read a Zitron article; most are bored because the articles are too dense to read.

            But people are literally scared AI will destroy all the jobs after reading articles about how it will. Companies scared not to use AI, whether it makes sense or not, just so they don't miss out - that's where the FUD is.

  • isodev 4 days ago ago

    > Strategic partnership enables OpenAI to build and deploy at least 10 gigawatts of AI datacenters with NVIDIA systems representing millions of GPUs

    I know watts but I really can’t quantify this. How much of Nvidia is there in the amount of servers that consume 10GW? Do they all use the same chip? What if there is newer chip that consumes less, does the deal imply more servers? Did GPT write this post?

    • mr_toad 4 days ago ago

      You don’t need AI to write vague waffly press releases. But to put this in perspective an H100 has a TDP of 700 watts, the newer B100s are 1000 watts I think?

      Also, the idea of a newer Nvidia card using less power is very amusing.

    • nick__m 4 days ago ago

      A 72-GPU NVL72 rack consumes up to 130 kW, so it's a little more than 5,500,000 GPUs
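
      A quick sanity check of that figure, assuming the NVL72 numbers above (up to 130 kW and 72 GPUs per rack) hold for the whole deployment:

```python
# GPUs that fit in a 10 GW power budget at ~130 kW per 72-GPU NVL72 rack
# (rack figures from the comment above; ignores cooling/facility overhead).
total_power_w = 10e9        # 10 GW
rack_power_w = 130e3        # up to 130 kW per NVL72 rack
gpus_per_rack = 72

racks = total_power_w / rack_power_w
gpus = racks * gpus_per_rack
print(f"~{racks:,.0f} racks, ~{gpus / 1e6:.2f} million GPUs")
```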

    • az226 3 days ago ago

      $150-200B worth of hardware. About 2 million GPUs.

      So this investment is somewhat structured like the Microsoft investment where equity was traded for Azure compute.

  • zitterbewegung 4 days ago ago

    To put this into perspective, this datacenter would have the land area of Monaco (740 acres), assuming 80kW per rack.

    • oezi 4 days ago ago

      Monaco is so tiny it fits into Berlin's Tempelhofer Feld (a circular park inside the city).

      • ben_w 4 days ago ago

        I mean, you're not wrong, but Tempelhofer is also a former airport, so had to be quite big. And since Brexit, Berlin is the biggest city in the EU.

        • rhubarbtree 4 days ago ago

          Paris? I mean, if we are considering the wide area. Otherwise London wouldn’t have been considered the largest.

    • vessenes 4 days ago ago

      So, basically a single BYD factory

    • dguest 4 days ago ago

      Monaco is 2 km^2 [1].

      I'm confused because if I assume each rack takes up 1 square meter I get a much smaller footprint: around 12 hectares or 17 football fields.

      And that assumes that the installation is one floor. I don't know much about data centers but I would have thought they'd stack them a bit.

      Am I the only person who had to look up how big Monaco was?

      [1]: https://en.wikipedia.org/wiki/Monaco

      [2]: https://www.wolframalpha.com/input?i=10+GW+%2F+%2880kw+%2F+m...
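
      The arithmetic spelled out (80 kW/rack and 1 m² per rack are assumptions from this thread; real facilities add aisles, cooling, and power gear, and the pitch size is my assumption):

```python
# Footprint of 10 GW of racks at 80 kW and 1 m^2 each (thread's assumptions).
racks = 10e9 / 80e3                  # 125,000 racks
area_m2 = racks * 1.0                # 1 m^2 per rack
hectares = area_m2 / 10_000
fields = area_m2 / 7_140             # ~7,140 m^2 per 105 m x 68 m pitch
print(f"{hectares:.1f} ha, ~{fields:.1f} football fields")
```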

    • ahmeneeroe-v2 4 days ago ago

      To put Monaco in perspective, the US could fit 4.8M Monacos

  • labrador 4 days ago ago

    For scale: The 1960's era US Navy submarine I served on had a 78MW reactor, so 10GW is 128 nuclear submarines

    • Recursing 4 days ago ago

      For a better sense of scale: it's about 2% of the average US electricity consumption, and about the same as the average electricity consumption of the Netherlands (18 million people)
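
      That ~2% figure checks out if you take ~4,000 TWh/year of US electricity consumption as a ballpark (my assumption, not from the comment):

```python
# 10 GW of continuous draw as a share of US electricity consumption,
# assuming ~4,000 TWh/year for the US (ballpark assumption).
us_twh_per_year = 4_000
hours_per_year = 8_760
us_avg_gw = us_twh_per_year * 1_000 / hours_per_year   # TWh -> GWh -> avg GW
share = 10 / us_avg_gw
print(f"US average load ~{us_avg_gw:.0f} GW, 10 GW is {share:.1%}")
```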

      • tonyhart7 4 days ago ago

        Wtf, and this is from 1 company

        How much would Anthropic, xAI, Google, and Microsoft be on top of that?

        Having around 5% of the entire country's infrastructure on AI hardware seems excessive, no???

        • Recursing 4 days ago ago

          No this is from just one partnership, my sense is that OpenAI alone wants more than that.

          5% to 10% of US electricity going to AI in 10 years is consistent with the current valuations of AI companies.

        • bcrosby95 4 days ago ago

          Around 5% in the next 5 years for AI alone sounds pretty in-line with projections I've seen.

          • randomNumber7 4 days ago ago

            Isn't this pretty bad for the climate? I don't dare to ask ChatGPT now /S

      • udkl 4 days ago ago

        For another sense of scale: A 500MW AI-centric datacenter could cost $10 billion or more to build. So 10GW is $200 billion!

    • gehsty 4 days ago ago

      Some more context: nuclear power stations can be up to 2GW, offshore windfarms are seemingly hitting a plateau at ~1.5GW, and individual turbines in operation now are 15MW. Grids are already strained; 525kV DC systems can transmit ~2GW of power per cable bundle…

      Adding 10GW of offtake to any grid is going to cause significant problems and likely require CAPEX-intensive upgrades (try to buy 525kV DC cable from an established player and you are waiting until 2030+), as well as new generation for the power!

      • onesociety2022 4 days ago ago

        But that's assuming they actually have to transport power over long distances right? If they colocate these massive AI datacenters right next to the power generation plants, it should be cheap to transport the power. You don't need to upgrade massive sections of the grid and build long-distance power lines.

        The xAI Colossus 2 1GW data centers seem to be located about ~20 miles from the power generation utility (https://semianalysis.com/2025/09/16/xais-colossus-2-first-gi...)

        • gehsty 3 days ago ago

          20 miles is a long way to move power. On land you have huge issues getting permits for construction as it's so disruptive; offshore, you need specialist vessels that serve an existing global supply chain.

      • vessenes 4 days ago ago

        Yeah the path forward here is going to be Apple-like vertical supply chain integration. There is absolutely no spare capacity in the infra side of electrical right now, at least in the US.

        • wongarsu 4 days ago ago

          And there is great cost saving potential in vertical integration. Distribution and transmission are huge costs. If you can build a data center right next to a power plant and just take all their power you get much better prices. Not trivial to do with the kinds of bursty loads that seem typical of AI data centers, but if you can engineer your way to a steady load (or at least steady enough that traditional grid smoothing techniques work) you can get a substantial advantage

          • theptip 4 days ago ago

            > bursty loads that seem typical of AI data centers

            Don’t datacenters want to run at their rated capacity 24/7?

        • gehsty 4 days ago ago

          I don’t think that’s possible with large scale power infrastructure, and specifically grid infrastructure is so tightly regulated. Closest that I’m aware of was TSMC buying the output of an entire offshore windfarm for 25yrs (largest power purchase contract ever - TSMC / Ørsted)… maybe Microsoft re starting nuclear power plants, or Google reporting offshore wind sites come out of contract (but nothing at the 10GW scale).

      • rlv-dan 4 days ago ago

        In the long run, perhaps this will give us a better power grid, just like the dotcom bubble gave rise to broadband?

    • melenaboija 4 days ago ago

      This still blows my mind.

      If each human brain consumes ~20W then 10 GW is like 500 M people, that sounds like a lot of thinking. Maybe LLMs are moving in the complete opposite direction and at some point something else will appear that vaporizes this inefficiency making all of this worthless.

      I don’t know, just looking at insects like flies and all the information they manage to process with what I assume is a ridiculous amount of energy suggests to me there must be a more efficient way to ‘think’, lol.

      • sindriava 4 days ago ago

        We know for a fact that current LLMs are massively inefficient, this is not a new thing. But every optimization you make will allow you to run more inference with this hardware, there's not a reason for it to make it meaningless any more than more efficient cars didn't obsolete roads.

        • dragonwriter 4 days ago ago

          > But every optimization you make will allow you to run more inference with this hardware

          Unless the optimization relies in part on a different hardware architecture, and is no more efficient than current techniques on existing hardware.

          > there's not a reason for it to make it meaningless any more than more efficient cars didn't obsolete roads

          Rail cars are pretty darned efficient, but they don’t really work on roads made for the other kind.

    • Muromec 4 days ago ago

      Or just ten very safe RBMK reactors rated 1GW each (they can't explode).

      • fragmede 4 days ago ago

        You almost got me. RBMKs had this problem with large positive void coefficients that was buried by the Soviet Union, which led to Chernobyl.

        • fusionadvocate 4 days ago ago

          The control rods with the graphite tips were the cherry on top...

    • HarHarVeryFunny 4 days ago ago

      A big power station of any type is ~1GW. Nuclear is slow to build, so I'd have to guess natural gas.

      • gpm 4 days ago ago

        The US is adding significantly more solar, and slightly more wind, than natural gas every year. This doesn't have to be placed where people already are, but can be placed where energy is the cheapest, which favours solar and wind substantially more than gas (or nuclear).

        The reasonable (cost effective, can be done quickly) thing to do is put this wherever you can generate solar + wind the most reliably, build out a giant battery bank, and use the grid as a backup generator. Over time build a better and better connection to the grid to sell excess energy.

        • Workaccount2 4 days ago ago

          Trump is personally and vindictively against green energy.

          He wants coal and gas.

          • ViscountPenguin 4 days ago ago

            When the price of gas is so much higher than solar, that hardly matters. No reason the data centre have to be in the US.

      • happosai 4 days ago ago

        It should be illegal to build that many fossil fuel power plants just to train LLMs.

        The blatant disregard of global warming by AI investors is truly repulsive.

        • ETH_start 3 days ago ago

          Keep in mind that the industrial processes that consume fossil fuel also contribute to quality of life in various ways. Improvements in emergency response and early detection infrastructure alone have resulted in deaths from extreme weather events reaching record low levels. Poverty as a whole has seen record-breaking decreases over the last 30 years.

          So there are other factors to weigh besides how much contributes to CO2 emissions.

    • dguest 4 days ago ago

      A typical reactor core is 1 GW, so it's also one rather big nuclear power plant.

      • Muromec 4 days ago ago

        More like two (and a half )

  • gmm1990 4 days ago ago

    Strange unit of measurement. Who would find that more useful than expected compute or even just the number of chips.

    • skhameneh 4 days ago ago

      I wouldn't be surprised if power consumption is a starting point due to things like permitting and initial load planning.

      I imagine this as a subtractive process starting with the maximum energy window.

    • zozbot234 4 days ago ago

      It's a very useful reference point actually because once you hit 1.21 GW the AI model begins to learn at a geometric rate and we finally get to real AGI. Last I've heard this was rumored as a prediction for AI 2027, so we're almost there already.

      • outside2344 4 days ago ago

        Is this a crafty reference to Back to the Future? If so I applaud you.

      • the_70x 2 days ago ago

        Came only here searching for 1.21GW

      • jsnell 4 days ago ago

        1.21GW is an absurd level of precision for this kind of prediction.

        • leptons 4 days ago ago

          It's from the movie "Back to the Future"

    • isoprophlex 4 days ago ago

      If a card costs x money, and operating it every year/whatever costs y money in electricity, and y >> x, it makes sense to mostly talk about the amount of electricity you are burning.

      Because if some card with more FLOPS comes available, and the market will buy all your FLOPS regardless, you just swap it in at constant y / for no appreciable change in how much you're spending to operate.

      (I have no idea if y is actually much larger than x)
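
      For what it's worth, plugging in assumed numbers (~$30k per H100 and $0.10/kWh, both my assumptions) suggests y is actually much smaller than x at the card level, though facility overhead and short useful lifetimes narrow the gap:

```python
# Card cost (x) vs annual electricity cost (y) for one H100 at 700 W TDP,
# run 24/7. Price and electricity rate are assumptions, not from the thread.
card_cost = 30_000                      # USD, assumed H100 price
tdp_kw = 0.7                            # 700 W
rate = 0.10                             # USD per kWh, assumed
y = tdp_kw * 24 * 365 * rate            # annual power cost
print(f"x = ${card_cost:,}, y = ${y:,.0f}/year (y/x = {y / card_cost:.1%})")
```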

    • credit_guy 4 days ago ago

      A point of reference is that the recently announced OpenAI-Oracle deal mentioned 4.5 GW. So this deal is more than twice as big.

    • aprdm 4 days ago ago

      At large scales a lot of it is measured on power instead of compute, as power is the limitation

    • ben_w 4 days ago ago

      For a while, it's become increasingly clear that the current AI boom's growth curve rapidly hits the limits of the existing electricity supply.

      Therefore, they are listing in terms of the critical limit: power.

      Personally, I expect this to blow up first in the faces of normal people who find they can no longer keep their phones charged or their apartments lit at night, and only then will the current AI investment bubble pop.

    • leetharris 4 days ago ago

      Probably because you can't reliably predict how much compute this will lead to. Power generation is probably the limiting factor in intelligence explosion.

      • sedawkgrep 4 days ago ago

        That, and compute always goes up.

  • hooloovoo_zoo 4 days ago ago

    These $ figures based on compute credits or the investor's own hardware seem pretty sketchy.

  • pera 4 days ago ago

    10 gigawatts sounds ridiculously high - how can you estimate the actual usage? I guess they are not running at capacity 24/7, right? Because that would be more than the consumption of several European countries, like Finland and Belgium:

    https://en.m.wikipedia.org/wiki/List_of_countries_by_electri...

  • xnx 4 days ago ago

    What does this mean? "To support the partnership, NVIDIA intends to invest up to $100 billion in OpenAI progressively as each gigawatt is deployed."

    • vlovich123 4 days ago ago

      Nvidia is buying their own chips and counting it as a sale. In exchange they’re maybe getting OpenAI stock that will be worth more in the future. Normally this would count as illegally cooking the books I think but if the OpenAI investment pays off no one will care.

      • toomuchtodo 4 days ago ago

        What if it doesn't?

        • vlovich123 4 days ago ago

          Still unlikely they’d get prosecuted because they’re not trying to hide how they’re doing this and there’s no reasonable expectation that OpenAI is likely to fold. I doubt they’d improperly record this in their accounting ledger either.

        • nutjob2 4 days ago ago

          It's a good question since it's probably the 99% case.

    • patapong 4 days ago ago

      Perhaps it means OpenAI will pay for the graphics cards in stock? Nvidia would become an investor in OpenAI, thereby moving up the AI value chain as well as ensuring demand for GPUs, while OpenAI would get millions of GPUs to scale their infrastructure.

    • dtech 4 days ago ago

      They're investing in kind. They're paying with chips instead of money

    • mmmllm 4 days ago ago

      They will transfer the money to buy their own chips right before each chip is purchased

    • solarexplorer 4 days ago ago

      That they will invest $10 in OpenAI for each watt of NVIDIA chips that is deployed? EDIT: In steps of 1GW, it seems.

    • jstummbillig 4 days ago ago

      I am confused as to what the question is.

    • losteric 4 days ago ago

      so nvidia's value supported by the value of AI companies, which nvidia then supports?

    • dsr_ 4 days ago ago

      It means this is a bubble, and Nvidia is hoping that their friends in the White House will keep them from being prosecuted, or at least from substantial penalties.

    • re-thc 4 days ago ago

      > What does this mean?

      > to invest up to

      i.e. 0 to something something

  • me551ah 4 days ago ago

    So OpenAI is breaking up with Microsoft and Azure?

    • freedomben 4 days ago ago

      They've been sleeping with Oracle too recently, so I don't think they're breaking up, just dipping a toe in the poly pool

    • FinnKuhn 4 days ago ago

      I would say Microsoft cheated on OpenAI first ;)

      https://www.reuters.com/business/microsoft-use-some-ai-anthr...

    • Handy-Man 4 days ago ago

      It was more like Microsoft refused to build the capacity OpenAI was asking for, so they gave them their blessing to buy additional compute from others.

      It does seem like Satya believes models will get commoditized, so there's no need to hitch themselves to OpenAI that strongly.

    • mmmllm 4 days ago ago

      Are Anthropic and Google breaking up with Nvidia?

  • pyrophane 4 days ago ago

    So Nvidia is giving OpenAI money so OpenAI can buy more Nvidia GPUs?

    • 2sk21 4 days ago ago

      Telecom vendors were doing exactly this before the dotcom crash of 2000

  • TheRealGL 4 days ago ago

    Did I miss the part where they mention the 10 large nuclear plants needed to power this new operation? Where's all the power coming from for this?

    • nutjob2 4 days ago ago

      Also, the fact that they announce not how much computing power they are going to deploy but rather how much electricity it's going to use (as if power usage were a useful measure of processing power) is kind of gross.

      "Good news everybody, your power bills are going up and your creaking, chronically underfunded infrastructure is even closer to collapse!"

    • HDThoreaun 4 days ago ago

      Build this thing in the middle of the desert and you would need around 100 sq miles of solar panels plus a fuck load of batteries for it to be energy independent. The solar farm would be around $10 billion, which is probably far less than the GPUs cost
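
      A rough check of that sizing, with my own assumptions for panel density (~200 W/m² at peak) and desert capacity factor (~25%):

```python
# Average output of 100 square miles of solar, before battery losses.
area_m2 = 100 * 2.59e6          # 100 sq miles in m^2
peak_gw = area_m2 * 200 / 1e9   # ~200 W/m^2 of panel at peak (assumption)
avg_gw = peak_gw * 0.25         # ~25% desert capacity factor (assumption)
print(f"peak ~{peak_gw:.0f} GW, average ~{avg_gw:.0f} GW")
```

That leaves some margin over the 10 GW target to cover round-trip battery losses.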

      • xnx 4 days ago ago

        Dissipating 10GW of heat is also a challenge in a sunny, hot, dry environment.

      • udkl 4 days ago ago

        $10 billion is small change compared to the estimated all-inclusive cost of $10 billion for EACH 500MW data center ... $200 billion for 10GW.

      • boringg 4 days ago ago

        Won't get you the necessary four nines of uptime and energy, sadly. I'm still 100% for this -- but it needs another model for energy delivery.

      • newyankee 4 days ago ago

        100 sq km should suffice

    • hoosieree 4 days ago ago

      Also water. You will be rationed, OpenAI will not.

      https://www.newstarget.com/2025-08-02-texas-ai-data-centers-...

    • nitwit005 4 days ago ago

      I assumed this headline was not aimed at the public, but at some utility they want to convince to expand capacity. Otherwise, bragging about future power consumption seems a bit perplexing.

      • Ianjit 3 days ago ago

        Or to assuage investors participating in the OpenAI secondary on the issue of cash burn.

    • catigula 4 days ago ago

      Consumer electric grids.

      • davis 4 days ago ago

        Exactly this. This is essentially a new consumer tax in your electrical bill. The buildout of the electrical grid is being put on consumers as a monthly surcharge via rising electricity rates. Everyone in the country is paying for the grid infrastructure to power these data centers, owned by trillion-dollar companies who aren't paying for their own needs.

      • delfinom 4 days ago ago

        Yep. Consumers are screwed and $500/month electric bills are coming for the average consumer within a year or two. We do not have the electricity available for this.

        • leptons 4 days ago ago

          I'm pretty average, living in a small home, and my electric bill is already >$500/mo in the summer, and that's with the A/C set at 76F during the day.

          • thorncorona 4 days ago ago

            Where do you live? How old is your house?

            500 is insane.

            • catigula 4 days ago ago

              I don't expect him to tell you where he lives but my bill EXPLODED recently due to what I now know is data center demand.

          • t0mas88 4 days ago ago

            How many kWh is that? At those amounts solar panels seem like a no-brainer business case?

  • softwaredoug 4 days ago ago

    How will this actually be powered? Just seems like we’re powering an ecological disaster.

  • aanet 4 days ago ago

    I'm old enough to remember when vendor financing was both de rigueur and also frowned upon... (1990s: telecom sector, with all big players like Lucent, Nortel, Cisco, indulging in it, ending with the bust of 2001/2002, of course)

    • alephnerd 4 days ago ago

      This absolutely feels like the Telco Bubble 2.0, and I've mentioned this on HN as well a couple times [0]

      [0] - https://news.ycombinator.com/item?id=44069086

      • boringg 4 days ago ago

        For sure a great infrastructure build-out -- let's hope the leftovers are better energy infrastructure, so that whatever comes next in the 7 years after the flame-out has some great stuff to build on (similar to telco bubble 1.0), and is less damaging to planet earth in the long arc.

        • alephnerd 4 days ago ago

          Yep. The Telco Bust 1.0 along with the Dotcom Bust is what enabled the cloud computing boom, the SaaS boom, and the e-commerce boom by the early-mid 2010s.

          I think the eventual AI bust will lead to the same thing, as the costs for developing a domain-specific model have cratered over the past couple years.

          AI/ML (and the infra around it) is overvalued at current multiples, but the value created is real, and as the market grows to understand the limitations but also the opportunities, a more realistic and permanent boom will occur.

          • aanet 4 days ago ago

            Yeah - no doubt about the eventual productivity gains from AI/ML (which are real, of course, just like the real gains from the telecom infra buildup), but must an economy go through a bubble first to realize these productivity gains?

            It appears that the answer is "more likely yes than not".

            Counting some examples:

            - self driving / autonomous vehicles (seeing real deployments now with Waymo, but 99% deployment still ahead; meanwhile, $$$ billions of value destroyed in the last 10-15 years with so many startups running out of money, getting acquihired, etc)

            - Humanoid robots... (potential bubble?? I don't know of a single commercial deployment today that generates any solid revenue, but companies keep getting funded left and right)

            • Deegy 4 days ago ago

              Happened with the electrical grid too.

              I think you make a very interesting observation about these bubbles potentially being an inherent part of new technology expansion.

              It makes sense too from a human behavior perspective. Whenever there are massive wins to be had, speculation will run rampant. Everyone wants to be the winner, but only a small fraction will actually win.

  • StapleHorse a day ago ago

    I wonder if all that heat could be harvested and reused in turbines. Maybe using industrial scale heat pumps. I don't know.

  • searine 4 days ago ago

    I look forward to subsidizing this effort with my skyrocketing home power bill.

  • brendoelfrendo 4 days ago ago

    Where does this fit in with the $300 billion partnership between OpenAI and Oracle? You know, the one that also hasn't happened yet and catapulted Oracle's stock price through the stratosphere last week? Is that also getting built or is OpenAI partnering with Nvidia to get access to the GPUs that neither they nor Oracle currently own?

  • gloxkiqcza 4 days ago ago

    I’m really curious how this affects the consumer GPU market over the next few years. Sure, there has been a GPU shortage for a few years now but if this continues, there should be an absolute surplus of obsolete-gen enterprise GPUs flooding the market, right? Any ideas what limitations and benefits these cards might have for an enthusiast?

    • dcchambers 4 days ago ago

      I feel like we're lucky Nvidia even sells consumer GPUs any more. At this point it's just a distraction to them and takes away resources they could be devoting to higher value hardware.

      And the data center-class hardware doesn't do well in a home environment. It's not good for gaming. It runs hot and uses a ton of energy. Not to mention, silicon that is running hot 24/7 for years probably isn't the best thing to own second hand.

      • NooneAtAll3 4 days ago ago

        probably explains why we have "just rent H100 in the cloud, duh" influencers under every hardware building post on hn now

    • jama211 4 days ago ago

      These systems aren't easily converted into desktop-style GPUs, so it may not trickle down the way we hope

    • wmf 4 days ago ago

      I assume surplus DGX A100 are already out there but they consume kilowatts so enthusiasts can't even plug them in.

  • Brysonbw 4 days ago ago

    The infrastructure and energy required to power these systems at scale are critical. I hope we carefully consider the environmental impact of building and operating data centers. I’m optimistic that we will develop efficient and sustainable solutions to power the data centers of today and the future

  • TheAlchemist 4 days ago ago

    Folks old enough to have been around in 2000 have seen this movie before.

    If this were such a great business, money would be coming from outside and Nvidia would be using its profits to scale production. But they know it's not, and once the bubble pops, their profit margin evaporates in months. So they keep the ball rolling - this is pretty much equivalent to buying the cards from ... themselves.

  • lawlessone 4 days ago ago

    Nvidia if you're listening give me 10K and i'll bu...*invest 10K+ 10 euro worth of cash in your product.

  • zuInnp 4 days ago ago

    Yeah, who cares about the environment... who needs water and energy, if your AI agent can give you a better pep talk

    • novaRom 4 days ago ago

      What's the purpose of having access to smart assistants if it doesn't result in improving your basic needs or your quality of life? Who is spending now? Only high-income households, while the majority is struggling with high utility bills and grocery prices - very basic needs.

    • rlv-dan 4 days ago ago

      Don't forget about better filters for influencers talking about the climate crisis!

    • tgv 4 days ago ago

      When fucking up the human mind isn't enough. This is really villainous.

      And before you think that's nonsense, let's not forget these people are accelerationists. Destroying the fabric of society is their goal.

  • Nevermark 4 days ago ago

    There are venture capital firms.

    Nvidia is transforming into a venture GPU company. X GPUs for Y percent.

    • mxschumacher 3 days ago ago

      openAI is far past the venture stage

  • DebtDeflation 4 days ago ago

    Wouldn't Nvidia be better served investing the $100B in expanding GPU manufacturing capacity?

    • vessenes 4 days ago ago

      They’re already spending as much money as they possibly can on growth, and have no further use for cash currently - they’ve been doing share buybacks this year.

    • ecshafer 4 days ago ago

      By investing in TSMC? By buying TSMC? I don't think $100B would buy them enough current generation capacity to make a difference from scratch.

    • paxys 4 days ago ago

      The don't have to pick just one.

  • JCM9 4 days ago ago

    This is throwing more cards on the house of cards. Nvidia is “investing” in OpenAI so OpenAI can buy GPUs from NVidia. Textbook “round tripping.”

    I generally like what’s been happening with AI but man this is gonna crash hard when reality sets in. We’re reaching the scary stage of a bubble where folks are forced to throw more and more cash on the fire to keep it going with no clear path to ever get that cash back. If anyone slows down, even just a bit, the whole thing goes critical and implodes.

    • crowcroft 4 days ago ago

      It seems similar to how GE under Jack Welch would use their rock solid financials to take on low cost debt that they could lend out to suppliers who needed finance to purchase their products.

      The biggest difference here, though, is that most of these moves seem to involve direct investment and the movement of equity, not debt. I think this is an important distinction, because if things take a downturn, debt is highly explosive (see GE during the GFC) whereas equity is not.

      Not to say anyone wants to take a huge markdown on their equity, and there are real costs associated with designing, building, and powering GPUs which needs to be paid for, but Nvidia is also generating real revenue which likely covers that, I don't think they're funding much through debt? Tech tends to be very high margin so there's a lot of room to play if you're willing to just reduce your revenue (as opposed to taking on debt) in the short term.

      Of course this means asset prices in the industry are going to get really tightly coupled, so if one starts to deflate it's likely that the market is going to wipe out a lot of value quickly and while there isn't an obvious debt bomb that will explode, I'm sure there's a landmine lying around somewhere...

      • paganel 4 days ago ago

        > debt is highly explosive (see GE during the GFC) whereas equity is not.

        Not as explosive as debt, but I'd venture to say that nowadays equity is a lot more "inflammable" compared to 2008-2010, as in a lot more debt-like (which I think partly explains the current equity bubble in the US).

        As in, there are lots and lots of investment funds/pension funds/other such like financial entities which are very heavily tied to the "performance" of equity, and I'm talking about trillions (at this point) of dollars, and if that equity were to get a, let's say, 20 or 30% hair-cut in a matter of two-three months (at most), then we'll for sure be back in October 2008 mode.

        • littlestymaar 4 days ago ago

          > As in, there are lots and lots of investment funds/pension funds/other such like financial entities which are very heavily tied to the "performance" of equity, and I'm talking about trillions (at this point) of dollars, and if that equity were to get a, let's say, 20 or 30% hair-cut in a matter of two-three months (at most), then we'll for sure be back in October 2008 mode.

          Just curious, can you detail how it would fail exactly?

          • crowcroft 4 days ago ago

            Anytime there's a massive drawdown in equities, an asset-liability mismatch shows up (margin calls) because someone was borrowing money to spend in the short term against the value of assets that have now disappeared.

            It might not be the catastrophic cascading failure of the GFC, but someone somewhere in the pile will get exposed.

            • littlestymaar 4 days ago ago

              Ah yes I see. It's the idea that somewhere, somehow, there is debt that's funding all of this, even if it's very indirect.

        • crowcroft 4 days ago ago

          Totally agree, it might not blow up in Nvidia's face, but there's a margin call sitting around somewhere in the pile.

      • psunavy03 4 days ago ago

        GE also created their "rock-solid financials" by moving money around as necessary to make earnings projections.

      • phh 4 days ago ago

        Except I'm guessing they are not selling their equity; they are taking on debt backed by their equity?

        • crowcroft 4 days ago ago

          Yeah, this is speculation, you might be right. I'm not sure exactly how they're doing this, but my thinking would be:

          1. Selling equity (probably good).

          2. Financed with actual profits over time showing up as lower margins on the income statement (probably good).

          3. Issuing debt backed by their equity (possibly a dumpster fire).

          • bwfan123 4 days ago ago

            > Financed with actual profits over time showing up as lower margins on the income statement (probably good)

            Would these equity investments only impact the balance sheet as financial investments? Why would they show up as lower margins on the income statement?

    • radu_floricica 4 days ago ago

      But real GPUs are being built, installed and used. It's not paper money, it's just buying goods and services partly with stock, which is a very solid and time-honored tradition that happens to align incentives very well.

      • mgh95 4 days ago ago

        What revenues do these GPUs generate for OpenAI? OpenAI is not currently profitable, and it is unclear if its business model will ever become profitable -- let alone profitable enough to justify this investment. Currently, this only works because the markets are willing to lend, and let NVIDIA issue stock, to cover the costs of manufacturing the GPUs.

        That's where the belief that we are in a bubble comes from.

        • theptip 4 days ago ago

          OpenAI is profitable if they stop training their next generation models. Their unit economics are extremely favorable.

          I do buy that they are extremely over-valued if they have to slow down on model training.

          For cloud providers, the analysis is a bit more complex; presumably if training demand craters then the existing inference demand would be met at a lower price, and maybe you’d see some consolidation as margins got compressed.

          • mgh95 4 days ago ago

            > OpenAI is profitable if they stop training their next generation models. Their unit economics are extremely favorable.

            But OpenAI can't stop training their next generation models. OpenAI already spends over 50% of their revenue on inference cost [1] with some vendors spending over 100% of their revenue on inference.

            The real cash cow for them is in the business segment. The problem here is models are rapidly cloned, and the companies adjacent to model providers actively seek to provide consumers the ability to rapidly and seamlessly switch between model providers [2][3].

              Model providers are in the situation you imagine cloud providers to be in: a non-differentiated commodity product with high fixed costs and poor margins.

            [1] https://www.wheresyoured.at/why-everybody-is-losing-money-on...

            [2] https://www.jetbrains.com/help/ai-assistant/use-custom-model...

            [3] https://code.visualstudio.com/docs/copilot/customization/lan...

            • theptip 3 days ago ago

              I agree the market dynamics are weird now, I disagree that says much about the existence of other equilibria.

              For example, inference on older GPUs is actually more profitable than bleeding-edge right now; the shops selling hosted inference have options to broaden their portfolios if the advancement of the frontier slows.

              Cloud providers are currently “un-differentiated”, but there are three huge ones making profits and some small ones too. Hosting is an economy-of-scale business and so is inference.

              And all of these startups you quote like Cursor that are not free-cash-flow positive are simply playing the VC land grab game. Costs will rise for consumers if VCs stop funding, sure. That says nothing about how much TAM there is at the new higher price point.

              The idea that OAI is un-differentiated is just weird. They have a massively popular consumer offering, a huge bankroll, and can continue to innovate on features. Their consumer offering has remained sticky even though Claude and Gemini have both had periods of being the best model to those in the know.

              And generally speaking there are huge opportunities to do enterprise integrations and build out the retooling of $10T of economic activities, just with the models we have now; a Salesforce play would be a natural pivot for them.

              • mgh95 3 days ago ago

                > Cloud providers are currently “un-differentiated”, but there are three huge ones making profits and some small ones too. Hosting is an economy-of-scale business and so is inference.

                Anybody who has worked in a compliance-heavy segment (PCI-DSS, HIPAA, etc.) will tell you the big 3 clouds have very significant differences from the smaller players. The differentiation is not on compute itself, but on the product. It's partially why products like AWS Bedrock exist and actively place model providers in competition both with each other and with AWS itself, which is exactly the market dynamic they should seek to avoid.

                > The idea that OAI is un-differentiated is just weird. They have a massively popular consumer offering, a huge bankroll, and can continue to innovate on features. Their consumer offering has remained sticky even though Claude and Gemini have both had periods of being the best model to those in the know.

                This is exactly where this line of reasoning goes off the rails. The consumer market is problematic (see the recent post about the segment it's growing in: basically young women with limited spend in low-income countries); a huge bankroll is also a huge liability, model providers are on a clock to get huge or die, and the innovation we are seeing is effectively an attempt to "scale up" models, not provide novel features.

                > Their consumer offering has remained sticky even though Claude and Gemini have both had periods of being the best model to those in the know.

                This isn't a good thing with current market mix.

                > And generally speaking there are huge opportunities to do enterprise integrations and build out the retooling of $10T of economic activities, just with the models we have now; a Salesforce play would be a natural pivot for them.

                Do you have any indication these are achieving buy-in or profitability? Most significantly, a recent MIT study found that 95% of generative AI pilots fail. The honeymoon period is rapidly coming to a close. Tangible results are necessary.

            • Workaccount2 4 days ago ago

              That's why we are seeing these insane numbers. The competition is "do or die" right now.

              Zuckerberg said in an interview last week he doesn't mind spending $100B on AI, because not investing carries more risk.

              • mgh95 4 days ago ago

                This only applies if you believe one of two things: first, that this specific line of inquiry is guaranteed to lead to a form of superintelligence or some other broadly applicable breakthrough; or second, that this form of machine learning unlocks or otherwise enables a market, otherwise inaccessible, large enough to justify this investment.

                To date, no evidence of either exists. See Zuckerberg's recent live demo of Facebook's Ray-Ban technology, for example.

        • davedx 4 days ago ago

          OpenAI generates plenty of revenues from their services. Don't conflate revenues with profits

          • mgh95 4 days ago ago

            I don't believe I am. Investors (value investors, not pump and dump investors) provide capital to companies on the expectation of profit, not revenue.

            • charcircuit 4 days ago ago

              Sure, and as long the expected profit keeps increasing investors are happy. They don't need to make an actual profit yet.

        • crowcroft 4 days ago ago

          The counterpoint to this is that while not profitable, the cashflow is real, and inference is marginally ROI positive. If you can scale inference with more GPUs, then eventually that marginal ROI grows large enough to cover the R&D and other expenses and you become profitable.
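
          In toy numbers (all hypothetical, not OpenAI's actual figures), the argument looks like this:

```python
# Hypothetical unit economics -- illustrative numbers only, not OpenAI's.
def breakeven_volume(fixed_costs: float, revenue_per_unit: float,
                     cost_per_unit: float) -> float:
    """Inference volume at which marginal profit covers fixed costs (R&D etc.)."""
    margin = revenue_per_unit - cost_per_unit
    if margin <= 0:
        raise ValueError("inference isn't marginally ROI positive")
    return fixed_costs / margin

# Say $5B/yr of R&D and salaries, $11 revenue vs $8 cost per million tokens:
units = breakeven_volume(5e9, 11.0, 8.0)
print(f"breakeven at {units:,.0f} million tokens per year")
```

          The whole bet is that the per-unit margin stays positive as volume scales; if competition compresses it toward zero, the breakeven volume shoots toward infinity.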

          • mgh95 4 days ago ago

            "Marginally ROI positive" works in a ZIRP environment. These are huge capital investments; they need to at least clear treasury return hurdles and importantly provide attractive returns.

            I am fundamentally skeptical of "scaling inference". Margins are not defensible in the market segment OpenAI is in.

            • crowcroft 4 days ago ago

              For some of these tech companies, their valuations let them go to the market with their equity in a way that is basically a ZIRP environment. In a way you could say this is a competitive advantage someone like Nvidia has at the moment, and so they are trying to push it.

              I'm also pretty skeptical, and could imagine this whole thing blowing up, but it's not like this is a big grift that's going to end up like the GFC either.

              • mgh95 4 days ago ago

                I think it's possible we are in datacenter GPU overcapacity already, and NVIDIA is burning its stock to avoid the music stopping.

                It's already happening in China, where datacenters are at GPU overcapacity. I wouldn't be surprised if it occurs here.

        • cluckindan 4 days ago ago

          Wow, diluting stock during a bull run is incredibly short-sighted. NVIDIA is betting there will never be a downturn. If there is, the dilution causes late investors to either be left holding the bag or be forced to sell (potentially at a loss), meaning the stock has the potential to drop like a stone at the first sign of trouble.

          I guess that’s why they would be gaming their numbers: to convince the next greater fools.

        • humanizersequel 4 days ago ago

          They're doing about a billion per month in revenue by running proprietary models on GPUs like these. Unless they're selling inference with zero/negative margin, it seems like a business model that could be made profitable very easily.

          • mgh95 4 days ago ago

            Revenue != profit, and you don't need to become net negative margin to be net unprofitable. Expensive researchers, expensive engineers, expensive capex, etc.

            Inference has extremely different unit economics from a typical SaaS like Salesforce or adtech like google or facebook.

            • humanizersequel 4 days ago ago

              All of those expenses could be trimmed in a scenario where OpenAI or other big labs pivot to focus primarily on profitability via selling inference.

              • mgh95 4 days ago ago

                Currently, selling LLM inference is a Red Queen race: the moment you release a model, others begin distilling it and attempting to sell your model cheaper, avoiding the expensive capitalized R&D costs. This can occur because the LLM market is fundamentally -- at best -- minimally differentiated; consumers are willing to switch between vendors ("big labs", as you call them, but they aren't really research labs) to whoever offers the best model at the lowest price. This is emphasized by the distributors of many LLMs, developer tools, offering ways to switch the LLM at runtime (see https://www.jetbrains.com/help/ai-assistant/use-custom-model... or https://code.visualstudio.com/docs/copilot/customization/lan... for examples). The distributors of LLMs actively working against LLM providers' margins provides an exceptionally strong headwind.

                This market dynamic begets a low-margin race to the bottom, where no party appears able to secure the highly attractive unit economics typical of tech (think the >70% service margins we usually see).

                Inference is a very tough business. It is my opinion (and likely the opinion of many others) that the margins will not sustain a typical "tech" business without continual investment to attempt to develop increasingly complex and expensive models, which itself is unprofitable.

                • humanizersequel 4 days ago ago

                  I don't disagree but you're moving the goalposts. I never said that they could achieve the profits of a typical tech business, just that they could be profitable. Also, the whole distilling problem doesn't happen if the model is proprietary.

                  • mgh95 4 days ago ago

                    > I don't disagree but you're moving the goalposts. I never said that they could achieve the profits of a typical tech business, just that they could be profitable. Also, the whole distilling problem doesn't happen if the model is proprietary.

                    In the absence of typical software margins, they will be eroded by providers of "good enough" offerings (AWS, Azure, GCP, etc.) who gain more profit from the bundled services than OpenAI does from the primary service. This has happened multiple times in history, resulting either in businesses trading below their IPO price (such as Elastic, Hashicorp, etc.) or in outright bankruptcy.

                    Second, the distilling happens on the outputs of the model. Model distillation refers to the use of a model's outputs to train a secondary, smaller model. Do not mistake distillation for training (or retraining) sparse models. You can absolutely distill proprietary models. In fact, that is how the DeepSeek-R1-Distill-Qwen and DeepSeek-R1-Distill-Llama models were trained. This also happens with Chinese startups distilling OpenAI models to resell [2].
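
                    Mechanically, distillation just means training a cheaper student on the teacher's output distributions. A minimal toy sketch (a linear "teacher" and "student" invented for illustration; real distillation uses LLM logits or sampled outputs):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy "teacher": a fixed linear model over a tiny vocabulary, standing in
# for a proprietary model queried through its API. Everything is invented.
teacher_W = rng.normal(size=(4, 8))   # 4 input features -> 8 "tokens"
student_W = np.zeros((4, 8))          # the cheaper model we train ourselves

X = rng.normal(size=(256, 4))         # "prompts", as feature vectors
soft_labels = softmax(X @ teacher_W)  # teacher's output distributions

# Distillation = minimize cross-entropy between student and teacher outputs.
lr = 1.0
for _ in range(2000):
    probs = softmax(X @ student_W)
    grad = X.T @ (probs - soft_labels) / len(X)
    student_W -= lr * grad

agreement = (softmax(X @ student_W).argmax(1) == soft_labels.argmax(1)).mean()
print(f"student matches teacher's top token on {agreement:.0%} of prompts")
```

                    The point is that the student never needs the teacher's weights or training data -- API outputs are enough, which is why proprietary models can be distilled.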

                    The worst part is OpenAI is already having to provide APIs to do this [1]. This is not ideal, as OpenAI wants to lock people into (as much as possible) a single platform.

                    I really don't like OpenAI's market position here. I don't think it's long-term profitable.

                    [1] https://openai.com/index/api-model-distillation/

                    [2] https://www.theguardian.com/technology/2025/jan/29/openai-ch...

            • mrandish 4 days ago ago

              > Revenue != profit

              Indeed. And even if that revenue is net profitable right now (and analysts differ sharply on whether it really is), is there a sustainable moat that'll keep fast-followers from replicating most of OpenAI's product value at lower cost? History is littered with first-movers who planted the crop only to see new competitors feast on the fruit.

          • AlexandrB 4 days ago ago

            And even if they are selling inference at negative margin, they'll make it up in scale!

            • kapone 4 days ago ago

              These kinds of phrases are...eerily similar to the phrases heard right before...the .com bust. If you were old enough at the time, that's exactly what the mindset was back then.

              The classic story of the shoeshine boy giving out stock tips...and all that.

              We all know how that turned out.

        • empath75 4 days ago ago

          Amazon lost money every year for the first 9 years of its existence, and people said it was a bubble the entire time.

          • mgh95 4 days ago ago

            Amazon was gross margin profitable -- and significantly so -- the entire time.

            It just turns out they were a server farm subsidizing a gift shop.

            • rhetocj23 3 days ago ago

              Yeah this.

              Ultimately the marketplace was just an investment that had embedded within it a real option for AWS. Magical really.

        • radu_floricica 3 days ago ago

          > OpenAI is not currently profitable, and it is unclear if its business model will ever becomes profitable -- let alone profitable enough to justify this investment.

          Well, yes. Which again is how venture capitalism has worked for ... is it decades or centuries now? There is always an element of risk, with pretty solidly established ways to handle it: expected value, risk mitigation, etc.

          I haven't lived through the dot-com bubble (too young) but I've read about it. The ways they were throwing money at startups back then were... just insane. The potential of the technology is the same now as it was then: AI vs the Internet. It wasn't the tech that failed last time, it was the way the money was allocated.

          The math is actually quite mathing this time around. Most AI companies have solid revenues and business models. They aren't turning a profit because (like any tech startup) they chose to invest all their revenue plus investments into growth, which in this case is research and training new models. They aren't pivoting every 6 months, aren't burning through cash reserves just to pay salaries, and they've already gone through train/deploy cycles several times each, successfully.

          Are they overvalued? *shrug* That's between them and their investors, and we'll find that out eventually. But this is not a bubble that can burst as easily as last time, because we're all actually using and paying for their products.

      • dismalaf 4 days ago ago

        No one's implying it's fake money or resources, only that with no clear path to profit, eventually the money will stop flowing and valuations will implode.

        • sharpshadow 4 days ago ago

          It’s a global AI race; there is more at stake than profit.

          • dismalaf 4 days ago ago

            There was also a global AI race in the 80's...

      • kimixa 4 days ago ago

        But GPUs are a depreciating asset - if there's a bubble burst and your 5 million GPUs are idle for the next few years before demand picks up again, they'll be pretty outdated and of limited use.

        Infrastructure tends to have much longer lifetimes. A lot of the telco infrastructure "overbuilt" during that boom is still used today - you can always blow new fibre, replace endpoints and all that without digging everything up again, which was the largest cost in the first place. Sure, in the above example you'll still have the datacentre itself (and things like electricity connections and cooling) that can be reused, but that's a relatively small fraction of the total cost.
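
        A back-of-envelope comparison, assuming straight-line depreciation and made-up costs and lifetimes:

```python
# Straight-line book value after an idle stretch. Lifetimes and costs are
# rough assumptions for illustration (GPUs ~4 years, buried fibre ~25).
def remaining_value(cost: float, useful_life_years: float, years_idle: float) -> float:
    return cost * max(0.0, 1 - years_idle / useful_life_years)

gpus = remaining_value(10e9, 4, 3)    # $10B of GPUs, idle for 3 years
fibre = remaining_value(10e9, 25, 3)  # $10B of buried fibre, idle for 3 years

print(f"GPUs:  ${gpus / 1e9:.1f}B of book value left")
print(f"Fibre: ${fibre / 1e9:.1f}B of book value left")
```

        Same idle stretch, very different residual value, which is why telco overbuild aged so much more gracefully than idle GPUs would.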

      • belter 4 days ago ago

        > But real GPUs are being built, installed and used.

        At this moment they could as well be called bitcoin or tulips... no different from Chinese ghost towns: real houses being planned and built... And let's not talk to accountants about the depreciation rates on GPU hardware that is outdated in 8 to 12 months...

    • Jayakumark 4 days ago ago
      • tobias3 4 days ago ago

        Same thing with the 1.3 billion EUR investment of ASML into Mistral. ASML -> Mistral -> NVIDIA -> TSMC -> ASML -> ...

      • paxys 4 days ago ago

        It would be amusing if it also wasn't so accurate.

        • lotsofpulp 4 days ago ago

          I didn’t see the step where Larry has to sell any stock, and hence puts downward price pressure on Oracle share prices.

          What is the source of the cash in steps 3, 4, and 7?

          • Lalabadie 4 days ago ago

            He doesn't have to sell. He can finance the deal with debt backed by his newly risen stock as collateral. Then the debt is used to further inflate the price of the stock.

            The flywheel metaphor is pretty apt.

          • mcny 4 days ago ago

            It is us, index fund owners :clown:

            Disclaimer: I also have a small amount of money in vanguard IRA

            • lotsofpulp 4 days ago ago

              According to the image of the steps, Oracle’s share price is going up, presumably more than it would have without engaging in these steps. How can that cost index fund owners? They would be benefiting from the share price increase.

          • truelson 4 days ago ago

            Ultimately, debt will fuel this. Oracle can't pay with cashflow.

          • wmf 4 days ago ago

            Credit.

    • truelson 4 days ago ago

      Going to leave this link here: https://www.hussmanfunds.com/comment/mc250814/

      By many different measures, we are at record valuations (though it must be said, not on P/E). That tends not to end well. And housing prices are based on when mortgages were at 3% and have not reset accordingly. We are in everything-bubble territory and have been.

      • mrandish 4 days ago ago

        > Tends not to end well.

        I'm no financial guru, but this time around the boom/bust cycle there's a new, additional factor that's concerning. Even though I sold my individual tech company shares a few years ago and diversified all my equity holdings into broad-market ETFs like VTI, the so-called "Magnificent 7" tech companies have inflated so much that they now occupy a disproportionate percentage of even broad-market ETFs which hold ~5,000 stocks weighted by market cap. The obvious issue is that their share prices all have a significant component elevated by the same thing - unrealistic AI growth expectations.
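
        The cap-weighting arithmetic makes the concentration obvious (market caps below are made-up round numbers, not real quotes):

```python
# Cap-weighted concentration -- market caps below are made-up round numbers.
mag7_caps_T = [3.5, 3.4, 3.0, 2.3, 2.0, 1.2, 1.0]  # 7 megacaps, in $T
rest_of_market_T = 28.0                             # ~4,990 other holdings, $T

total = sum(mag7_caps_T) + rest_of_market_T
mag7_weight = sum(mag7_caps_T) / total
print(f"7 of ~5,000 holdings = {mag7_weight:.0%} of the 'diversified' index")
```

        So even a "diversified" total-market fund puts over a third of every dollar into the same handful of AI-elevated names.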

        • kapone 4 days ago ago

          Two words. Passive flows.

          Where do you think your 401K money is going...right into the S&P 500...and who gets the lion's share of allocation out of that? The Mag7 et al.

          If you chart the last 25 years, Gold (yes, that one...the useless metal) has outperformed the S&P (and it's making new highs even today). What does that say about hard assets vs these companies?

      • psunavy03 4 days ago ago

        Housing prices have not reset because of supply and demand. People are sitting on those 3 percent mortgages and not selling.

      • rapsey 4 days ago ago

        People are quite bearish and the stock market is making all time highs. This is actually a very good sign, because we are far from any euphoria.

        Always keep in mind the old saying: pessimists get to be right and optimists get to be rich.

        • thfuran 4 days ago ago

          Only if they're optimistic at the right time and not the wrong one.

          • rapsey 4 days ago ago

            Timing the market is a fallacy. Time in the market is what builds wealth.

            • thfuran 4 days ago ago

              You're treating a statistical tendency as immutable law. It's true that attempting to time the market is not generally a good investment strategy, but every investment is made at some time, and some of those are very bad times to put money in the market. That it'll probably recover eventually doesn't much help if you've lost everything in the interim.

              • rapsey 4 days ago ago

                Which is why you invest a percentage of your paycheque rather than wait to time the market.
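
                E.g., a fixed contribution automatically buys more shares when prices dip (toy numbers, entirely made up):

```python
# Toy dollar-cost averaging on an invented price series.
prices = [100, 80, 60, 90, 120, 110]  # hypothetical monthly prices
contribution = 500.0                  # fixed amount invested each month

shares = sum(contribution / p for p in prices)
avg_cost = contribution * len(prices) / shares  # = harmonic mean of prices

print(f"average cost per share: ${avg_cost:.2f} "
      f"vs simple mean price ${sum(prices) / len(prices):.2f}")
```

                The average cost per share is the harmonic mean of the prices paid, which is always at or below their simple mean, with no forecasting involved.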

        • Imustaskforhelp 4 days ago ago

          That quote definitely has some insane survivor bias in it.

          Optimists go bankrupt, you blame it on their work ethic, you discard all the optimists who didn't succeed, and you cherry-pick the optimists who got it right...

          It's classic survivorship bias.

          I am pessimistic on US stocks because they are so concentrated on AI for returns, and it's definitely a bubble, or approaching that territory; there is somewhat no denying it from what I observe.

          Your comment really is just off-putting to me because I feel like it's just copium which is going to be inhaled by the new generation, and then if we fail (and let's be honest, failure is a natural part of life) we are gonna blame ourselves, and that's just really depressing.

          I'd rather be right than rich. Maybe my definition of rich is something I can get out of hard work while maybe being a pessimist (just enough money to have freedom, lol).

          I don't want to make billions or hundreds of millions, and I don't want to build a VC-funded disaster for humanity in the name of shareholders, whether it's an ad dystopia or AI nightmare fuel. I'd rather make an imprint on humanity other than my bank account number, but maybe that's me being "optimistic".

          Sorry, but your comment truly ragebaited me... I have very strong opinions in this regard.

          • rapsey 4 days ago ago

            > I am pessimistic in US stocks because they are so concentrated on AI

              The Russell 2000 index just made an all-time high. The bull market is diverse and global. The indexes of many countries are also at all-time highs.

            • Imustaskforhelp 4 days ago ago

              I am pessimistic on the US S&P 500 for the most part, actually, given how concentrated it is in AI (refer to that Hank Green video).

              I also didn't know that the rest of the world's stocks are doing fine, actually. But maybe there is a difference between the economy and stocks at this point...

              I believe that we can all surely agree on the legendary John Bogle's philosophy, and in the current day and age realize that US S&P stocks are too concentrated in AI and world stocks can be better...

              Regarding the Russell 2000 index: I feel like a lot of money trickles down from the AI hype, but it's honestly great that the Russell is doing well.

              The point I am trying to make is that, at least for the US right now, its political system is so shaky that I can't trust its economic system, and there is no denying that if the AI bubble bursts, it would bleed the whole economy at this point, including the Russell.

              There was a great Hank Green video about this concept which I recommend: https://www.youtube.com/watch?v=VZMFp-mEWoM

              Also, a lot of countries are definitely in turmoil right now, so I am actually surprised by your statement that world markets are doing quite well; maybe stock markets are just another asset class that has gotten so inflated it's out of touch with ground reality... (something I heard in an Atrioc video)

              I am definitely a bit surprised to hear that world stocks are doing fine amid all the bloodbath of tariffs and the political issues the world is facing right now...

              • rapsey 4 days ago ago

                Politics is a distraction and largely irrelevant to investing.

                The stock market has so much money going into it that it is in a bull market, because people have nowhere else to put their money (real estate is dead atm).

                You are letting your political biases poison your financial decisions.

                • Imustaskforhelp 3 days ago ago

                  It isn't even a political bias; rather, we can't deny that the economy feels like kissing the ring, whether it's the US buying Intel stock or sort of forcing Nvidia to buy some Intel stock, etc.

                  And I feel like it's in a bull market because of AI hype, which was the main point of the original parent comment you responded to, I think...

                  If this AI hype fails to deliver, the Magnificent 7 will lose a huge amount of money, which will make stockholders feel less wealthy, so they will spend less, and that will have a drastic impact on the WHOLE economy.

                  Yes, it's in a bull market, but I don't want to find out whether I am at the peak of a bull market for an AI craze, y'know?

                  And I am not advocating against stocks, omg. I am just saying that world stocks are better in the current landscape, and I doubt that's poisoning my financial decisions.

                  NO, I don't want all of my saved money to go into an index which is going to be heavily dictated by the future of AI, which I and many others presume to be a bubble. I would much rather invest in index funds that target the world, heck, maybe even index funds that target every country ex-US.

                  My point is that the bubble will burst, and then at least the S&P/Nasdaq will definitely bleed.

                  Or we can talk about whether you think it's a bubble or not, since I am not comfortable investing in a bubbly situation no matter how lucrative it becomes, y'know?

                  What are your thoughts on it?

                  • rapsey 3 days ago ago

                    You can find excuses not to invest at any time. Easiest thing in the world has always been finding an excuse not to invest.

                    Mag7 are some of the most profitable and well run companies in history investing their insane profits.

                    No other country has public markets as developed, regulated and liquid as the US. Likely you are just investing into the unknown with a ton of risk factors you are not aware of. In places outside of the US politics actually is a significant factor in investing.

                    • Imustaskforhelp 3 days ago ago

                      Okay, so I appreciate your comment once again. I hope that this discussion can happen in good faith, and let's really continue it, as I think I can learn something new.

                      I can be wrong, I usually am.

                      That being said, My question to you is:

                      Do you believe it is an excuse if I don't invest in the Mag7, even while they are the most profitable and well run, because I believe their stock price is highly overinflated, and past performance isn't indicative of future performance except over the aggregate timescales the general markets have?

                      And do you think it's an excuse if I don't want to invest in the Mag7 because I am worried it's an AI bubble, a worry backed up by the scale of this AI craze?

                      If AI doesn't deliver on its promises, can you wager that the Mag7 would actually do well? Of course they wouldn't.

                      What makes you think AI will deliver on its prices, when it seems hyper-applicable only in tech, all other AI products seemingly run at a loss, and I can see no way to get normal users to actually pay for AI when there is so much FOSS AI...

                      What is the monetization plan? Is it to churn the money that you get from stocks into AI to get a higher valuation, and do some passing around the circle from one company to another, and repeat?

                      "Well run" is another questionable term, given how the Magnificent 7 includes Tesla, but maybe we can talk about that later.

                      I believe that time in the market beats timing the market, so your early experiences shouldn't be with a market that feels bubbly, y'know? Otherwise, you might just stop altogether. And I feel like things might fall down quicker than we think, as AI is kinda scrambling through; a lot of people felt disappointed in GPT-5. Reality is settling in, but is reality settling into those Magnificent 7 stocks?

                      I consider myself an average investor, in the sense that being a superior investor is insanely, insanely difficult, and it's much easier to think you are a superior investor because you got lucky, and then lose more money than you could've made over the long term while trying to recoup your previous losses... I definitely don't want to experience that in the first place, to keep my experiences somewhat moderate, y'know?

                      This is HN, so I presume you won't get bored with this response, as I love this talk and am trying to understand your point in good faith!

                      • rapsey 3 days ago ago

                        > I believe that their stock price is highly overflated and past performances aren't indicative of future performances

You are confusing a popular cover-my-ass legal disclaimer with market truth. Past performance absolutely is indicative of future performance the vast majority of the time; it is of course not a guarantee. An inflated price is also not a particularly good indicator of future performance. A stock generally has a high valuation for a reason.

> What makes you think AI will deliver on its pricing? It seems hyper-applicable only within tech, almost all other AI is seemingly run at a loss, and with so much FOSS AI around I can see no way to force normal users to actually pay for it...

Google, Microsoft and others run real-world AI, and I doubt it is at a loss. They make a ton of money on infrastructure. OpenAI operates at a loss, but it is a private company.

> I feel like things might fall apart quicker than we think, as AI is kinda scrambling through; a lot of people felt disappointed in GPT-5. Reality is settling in, but is reality settling into those Magnificent 7 stock prices?

                        You consider yourself to be an average investor, yet you disagree with the market, thus you think you are smarter than the market. This is cognitive dissonance. The market is a public consensus of the future. Stocks that are more valued have a higher price, because people are willing to bet money they will do better in the future.

This is not tulip mania, or even the dotcom bubble. Bull markets are always driven by investment cycles. Before AI it was mobile and cloud; those were not bubbles. Neither is AI, because the real-world usage is undeniable. The user-growth trajectory of ChatGPT was unprecedented. Google DeepMind's founders got a Nobel Prize for work done just a few years prior, but it was so groundbreaking it deserved it.

Also, I am not some investing guru; I just listen to some great investment podcasts: The Real Eisman Playbook (Steve Eisman is the person portrayed by Steve Carell in The Big Short) and The Compound and Friends.

                        • Imustaskforhelp 2 days ago ago

                          Hm I appreciate it but a genuine question:

It seems we don't agree on whether the market is in something of a bubble.

You say that real usage is undeniable. But to me it's only undeniable because SOTA models are being spoon-fed to everyone for free, on all fronts, including open-source Chinese models.

They are running at a loss because they are chasing these insane growth cycles, yet they have little to no moat.

Tell me how OpenAI or any AI company plans to become profitable and actually return great profits relative to the investment.

The thing is, they have to constantly retrain their models to stay SOTA, and people are realizing the models are being benchmaxxed.

Open-source models are getting pretty close, and I doubt the difference is that noticeable to most consumers, y'know?

There is no moat. Sure, maybe there is some moat in coding, as I feel that is the only thing open-source models haven't caught up on.

Open source has more or less SOTA image models, SOTA-ish video models and more, so anybody can compete on things like OpenRouter, which is where half the API usage ends up because of how convoluted other APIs are and how OpenRouter just sorta works...

I can provide sources as well, but there is a broad consensus that AI doesn't really help in research that much.

The point is, sure, this is great tech, but it's just unprofitable at scale once you factor in providing free access to the masses.

Tell me how these companies are going to make a consistent profit on AI without being crunched by each other's SOTA benchmaxxing and the cut-throat competition from China's open-source models.

I genuinely wonder what "real world AI" is to you, and how it's turning a profit.

Like, okay, maybe I can agree that inference could be made somewhat profitable, the way DeepSeek did it, but there is no way it was worth the return on investment...

And do you know what happened? Nvidia, selling the shovels ("infrastructure"), became the most valuable company. If this isn't a bubble, then why did Nvidia lose hundreds of billions of dollars just because China released the DeepSeek model?

Sure, Nvidia has recovered, but are you really not going to take the past into account?

Regarding the past-performance quote: I think I agreed in my original comment, but I was talking about past performance over something like 100 years. Computer stocks haven't existed that long, and this AI hype is quite new.

Companies like Google are integrating AI into everything not because they want to, but mostly because their stock rises whenever they mention AI.

I will repeat this again, my friend: tell me how the average investor is investing into a business that is going to make a profit...

How are they going to make a profit given the amount they have invested, with this degree of no-moat? Enterprise seems to be the clearest moat they have, but https://www.forbes.com/sites/jaimecatmull/2025/08/22/mit-say...

Coding models might be the most profitable, IMO, given that people want the absolute best there and don't mind paying for it (Claude Code), but that is a niche of a niche, and that alone can't justify the amount of investment and the stock prices, I suppose, not unless you believe in some sort of AGI.

How are these companies going to make a profit, dude? The only way they have so far is via their stock prices, and I know that you know that isn't sustainable; thus it becomes a sort of bubble situation.

I am an average investor, yet I am cautious about the current moment, because AI just kinda came out of nowhere and became a mainstream word. VCs funded things like Devin, which was literal BS, LMAO, but the amount of fear-mongering was crazy. So there was a FOMO of more VCs investing in more AI, which made people jump on the trend, to the point where anything labelled with AI gets a stock-price bump.

Am I wrong in the above statement?

How is this not a bubble? The tech is cool, but people aren't paying in the stock market to support the tech or something; they want returns, now... And once those returns stop coming, in the sense that people realize this... oops, looks like nobody wants those AI stocks anymore.

I read The Intelligent Investor to some degree, then picked up John Bogle's index-fund book, and realized that Benjamin Graham, the author of The Intelligent Investor, would also have preferred index funds. My whole sentiment shifted toward diversification and, I suppose, toward avoiding bubbles.

Honestly, it's funny: your statements could be shown in history as what people believed before a bubble burst and still be accurate, and mine, tbh, can be read the same way from the other side...

                          I hope you are still interested as I still love this discussion!

                          • rapsey 2 days ago ago

You are conflating OpenAI, xAI, and Anthropic with the entire field of AI. They are spending billions of private money with the goal of actual general AI. Maybe they will reach it, maybe they won't. They are doing a moonshot and pushing the field forward, and they have the money to do it.

But that is an entirely different game compared to what AI is being used for now. A few random examples I came across:

                            https://x.com/LinusEkenstam/status/1965014479760204118

                            https://abelpolice.com/

                            https://longevity.technology/news/new-ai-tool-demonstrates-p...

                            This is AI as it is capable now, solving real life problems and making industries more efficient. This is happening throughout practically every field of human endeavor, which is why ChatGPT is used so much. Medicine, biomedicine, law, translations, coding, investing, learning and so on.

                            nVidia is the most valuable company in the world right now, because they are powering practically all of it.

Worrying about profits right now is entirely the wrong thing to concentrate on. By analogy with the previous investment cycle: YouTube is one of the most valuable pieces of the tech industry. It was a money loser for a damn long time and would have gone broke had Google not bought it, not to mention being sued out of existence (a real threat in the early days). When Google bought it in 2006, everyone thought the bet was insane because of the infrastructure costs and legal risks. Now it is very profitable, because they had time to optimize and develop their business model.

                            • Imustaskforhelp 2 days ago ago

Exactly my point. I agree with your statement, tbh. Really great that we can reach a conclusion, but...

                              Here's the thing though:

YouTube has a moat. It is a social medium, and the network effect runs wild on it; tbh, there were a lot of other things (Vine?) that fell.

But can you say the same for AI, given open models?

China couldn't create an alternative to that social media (in some sense?) because it requires a network effect.

But it sure can use GPUs, maybe even build its own in-house GPUs, train on the data just as America did, and effectively price-dump with no restrictions :/

Honestly, if you believe AI has a moat similar to social media, then sure, but I just don't believe it does.

YouTube turned a profit because of its moat. Is there any moat in LLMs?

And if we are talking general-purpose robotics/automation, then I agree it's good.

But the average investor is investing as if actual general AI is sort of inevitable, when that's not the case.

From what I know, the optimizations of LLMs don't really transfer to robotics, so all these billions going into LLMs only to pivot into robotics is a bit :/ for the investors.

IMO, when I mentioned S&P AI stocks, I meant exactly things like Google, Microsoft and Amazon, which are still similar to OpenAI and Anthropic, don't you think?

The S&P's growth is heavily based on the calculated bet that Google, Microsoft and Amazon are going to be the winners of the AI "wars"; that's what I meant!!

If Google said a line similar to yours, that LLMs aren't the future, you can imagine how the market would react.

Funnily enough, between your comments I got recommended a video about the AI bubble... which is accused in the comments of being created by AI:

                              https://www.youtube.com/watch?v=37aUuoRyMhM

The tech is cool, but 95% of efforts are focused on the wrong thing, there is no advantage/moat, and, uh, it's still literally something like a bubble. Even in a bubble, Google and Amazon survived.

You could say I should still invest because stock prices grew even after the bubble burst, but that was after a deep reckoning, and as an average investor I'd rather prefer some more stability, knowing there is still the possibility of a bubble forming in the S&P and in US tech stocks at least.

These companies are using AI as a magic word. Vercel's keynote used AI-esque words 42 times... LET ME REPEAT, 42 times. Vercel isn't even that AI-based, lol; it's a React/Next.js app thingy for most people.

Still hoping you can comment! I was thinking of creating a Hacker News post to involve other people in this discussion, since at the end of the day it boils down to: is this a bubble?

I thought it was common knowledge, but maybe not. I could create an "Ask HN: Do you think the S&P 500 / Magnificent 7 is an AI bubble right now?" or smth!

Looking forward to your feedback; I've had a blast in this conversation! Wish to discuss more, lol!! Have a nice day (waiting for your comment).

                              • rapsey a day ago ago

> From what I know, the optimizations of LLMs don't really transfer to robotics, so all these billions going into LLMs only to pivot into robotics is a bit :/ for the investors.

The AI wave is more than just LLMs: movement autonomy for cars/robots, image/video generation, protein folding, etc. Those are not LLM-based AI applications, but they are all downstream of the transformer architecture. Autonomy is the missing piece of robotics, which is why so many billions are being invested now.

The lack of a moat around LLMs is a problem only for those playing in that field, but their actual goal is not just running LLMs; like I said, they are aiming for actual general intelligence.

In the meantime, companies are training or optimizing their own models for their use cases, like the ones I listed in the previous reply. They do have a moat, because it takes specialized knowledge to play in that field. Even the Abel police guys: their competitors were just an interface to ChatGPT, and it worked abysmally.

> IMO, when I mentioned S&P AI stocks, I meant exactly things like Google, Microsoft and Amazon, which are still similar to OpenAI and Anthropic, don't you think?

Absolutely not. OpenAI is a private company spending insane billions on a moonshot project. The public S&P 500 companies are investing their insane profits and making a return on those investments. Their infrastructure and scale are a moat.

                                • Imustaskforhelp a day ago ago

I had written an original draft of a reply, but this conversation is getting really interesting, so I wanted to write it again, lol.

I agree on protein folding and general-purpose automation being "AI" rather than LLMs, but that work was happening before the "AI" hype that OpenAI/ChatGPT triggered, where so much money started flowing in...

And I'd have no issue with these companies if their prices were grounded in the realism they were priced with pre-2022, before ChatGPT launched.

My biggest issue, which is the crux of this discussion, is that I believe tech stock prices are roaring so high mostly because of the AI hype, which inflates their prices.

Oracle made Larry Ellison the richest person for a while due to the Stargate project and its deal with OpenAI; OpenAI committed hundreds of billions of dollars to Oracle for compute, Oracle's stock rose further... rinse and repeat?

The thing is, why is Oracle, an S&P 500 company, rising at astronomical, unprecedented rates because of its LLM deals?

Google, Meta, Microsoft and Amazon are all integrating AI into every project and mentioning AI as much as possible, which, let's be honest, mostly means LLMs.

Yes, I know Google has some really interesting non-LLM AI projects which I know and love, but those predate 2022, and Google's price wasn't dictated by them the way it is now, y'know?

My conclusion is that a lot of people couldn't invest in OpenAI directly and thus flowed their money into anything LLM/AI-related in the markets, and the companies love this and mention AI as much as possible.

                                  Can we agree on this or not?

I can agree that these companies are investing in infrastructure, but that infrastructure is now mostly GPUs, which are only really useful for LLM-type workloads and are largely redundant for general-purpose stuff like running servers.

                                  Do we agree on this or not?

Also regarding infrastructure: most of them are just racking Nvidia GPUs, which is something Nvidia itself also offers and others could do too. Whether that counts as an "investment" is questionable...

It's an investment only if LLMs turn out to be profitable.

Firstly, the cut-throat competition means literally everyone is competing in it, which cuts into everyone's profits.

Secondly, some recent models are small enough to run reasonably well on modern consumer hardware, which could satisfy some users' needs without that infrastructure.

And even the people who don't have that and who use things like ChatGPT do so because it's free and costs them almost nothing. If AI providers start charging, they can switch to open-source models, riding out the tail end of the AI hype.

If you believe AGI is near, whatever that means, then literally everything I said falls out of the equation, but I assume you don't believe that.

Now, sure, there will be returns, but they won't be nearly as high as expected. In fact, I think most S&P companies will take a loss on all this model training and infrastructure building. Training models is a recurring cost if they have to stay SOTA, iirc, plus ever-higher pay for AI/ML engineers ($100M packages, funded in the end by the people investing in the S&P, dude).

                                  So with all of these things, I believe that there is a legitimate concern that the investment isn't worth the return.

                                  Then why are companies investing?

Because of FOMO. When the AI hype started, thanks to ChatGPT, every private-equity firm rushed for similar bets, and that kinda leaked into the S&P companies, which are doing the same thing, mentioning AI as much as possible.

                                  Do you agree?

If you can agree with all three of these statements to a reasonable degree, then I believe we can agree that this isn't so much an investment as a way to pump their stock prices by essentially mentioning the word AI, and that's all that matters to them in the end. But it all rests on the proposition that someone else will buy the stock thinking it will keep going up, and so on, when fundamentally the business model is kinda messed up when you think about it, y'know? To me, that's a bubble by definition: people investing in things without, for the most part, caring about the fundamentals.

If we have any disagreements, do let me know so I can maybe lighten up on some other points, as I love talking, lol. I'm loving this, although I feel like I write really long sentences. But hey, I'm writing to really explore why I believe what I do, and if you can convince me, then sure, I can be wrong; I usually am.

                                  Have a nice day and looking forward to your next comment!

                                  • rapsey 5 hours ago ago

> My biggest issue, which is the crux of this discussion, is that I believe tech stock prices are roaring so high mostly because of the AI hype, which inflates their prices.

They are not that high at all, at least nowhere near bubble territory according to the financial analysts I follow. A better metric than simple P/E is forward P/E, because the current price reflects forward guidance.
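For anyone unfamiliar with the metric: trailing P/E divides the share price by the last twelve months of earnings, while forward P/E divides it by estimated next-twelve-months earnings, so a stock can look expensive on trailing P/E but reasonable on forward P/E if earnings are expected to grow. A minimal sketch (all numbers hypothetical, not any real company's figures):

```python
def pe_ratio(price: float, eps: float) -> float:
    """Price-to-earnings ratio: share price divided by earnings per share."""
    return price / eps

# Hypothetical figures, for illustration only.
price = 180.0
trailing_eps = 6.0   # actual EPS over the past 12 months
forward_eps = 7.5    # analyst-estimated EPS for the next 12 months

trailing_pe = pe_ratio(price, trailing_eps)
forward_pe = pe_ratio(price, forward_eps)

print(f"trailing P/E = {trailing_pe:.1f}")  # 30.0
print(f"forward  P/E = {forward_pe:.1f}")   # 24.0
```

The same price looks cheaper on the forward multiple precisely because earnings growth is assumed; if the forecast earnings never materialize, the trailing multiple is what you actually paid.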

> My conclusion is that a lot of people couldn't invest in OpenAI directly and thus flowed their money into anything LLM/AI-related in the markets, and the companies love this and mention AI as much as possible.

                                    I guess, but that in itself does not mean it is a bubble.

> I can agree that these companies are investing in infrastructure, but that infrastructure is now mostly GPUs, which are only really useful for LLM-type workloads and are largely redundant for general-purpose stuff like running servers.

Recommendation algorithms run on GPUs, which are a huge part of any social network (like Meta and TikTok). Like I said, there is more than LLMs, and those also need to run on GPUs. They also rent out GPUs to other companies to run their own models, and make a very good business of it.

> when fundamentally the business model is kinda messed up when you think about it, y'know?

You are alone in that opinion. These are some of the most profitable companies in history, which is why they make up such a huge part of the S&P. You are describing a feedback loop of investment, which is normal in any bull cycle. It can turn into a bubble, and we may be at the start of one, but being an AI-skeptic investor just means not participating and having poor returns. The future is uncertain, and it sounds to me like you are looking for reasons not to invest.

                                    • Imustaskforhelp 3 hours ago ago

Recommendation algorithms ran on GPUs before the AI hype, and they keep running alongside AI inference/training; those AI datacenters will still sit idle if demand drops, so it isn't necessarily an investment.

These companies have more or less saturated their markets and have jumped into LLMs etc. to chase the new shiny thing.

> You are alone in that opinion. These are some of the most profitable companies in history, which is why they make up such a huge part of the S&P. You are describing a feedback loop of investment, which is normal in any bull cycle. It can turn into a bubble, and we may be at the start of one, but being an AI-skeptic investor just means not participating and having poor returns. The future is uncertain, and it sounds to me like you are looking for reasons not to invest.

Please try changing the word AI in that paragraph to crypto and see how relevant it remains :>

Also, this line kinda means "it may be a bubble, but it pays right now," in the sense that you are still basing your returns on some predicted P/E.

I am just saying people shouldn't consider the S&P 500 "safe enough" given this AI hype, if there is even a sheer possibility of a bubble forming.

Higher profits generally mean higher risks; there is no free lunch. So the S&P 500's higher profits carry higher risk, and people should know that risk before investing. My risk appetite doesn't support it, and I wonder how yours can.

Superior returns aren't easy, and be wary when someone promises them without an underlying reason, i.e., realized productivity gains in an underlying trade (think of a house builder building a house that is productive for a family, who will pay for it; compare that to how messy AI is, and how we still haven't discussed why there is so much hype in the market when the economy is doing kinda badly).

                                      > it sounds to me like you are looking for reasons not to invest.

Yes, I naturally took the discussion toward investing in S&P-tracking markets, and it's wild that you think so when I really agreed with you on a lot of things. Your last line sort of sums it up, except that when you look at the true gravitas of the situation, there is very little uncertainty about it (so no need for "maybe").

Sam Altman, CEO of OpenAI, has himself expressed concern that the AI market may be in a bubble, similar to the dot-com bubble of the late 1990s.

                                      This is my opinion too.

I was thinking of someone who wants a long time in the market; as I said, time in the market beats timing the market, so these "maybe" lines frighten me. Do I want to mess around and find out whether things are in a bubble with my money, on companies that are massively enshittifying themselves in the name of AI (YouTube auto-dub comes to mind)?

This was a good-faith discussion, and I appreciate it, but I don't agree that not participating in maybe-bubbly activities means you don't get returns. It's like saying I'm missing returns on crypto by not participating, when the whole thing is bubbly and those returns aren't magical...

The S&P should be considered a safe-enough investment, not something riding on the whims of a "maybe," I suppose?

I really like your last line, I must admit; it works for both an AI skeptic (in the sense that the tech is cool but won't generate much profit given the investment) and a pro-AI person...

> You are alone in that opinion...

I'd genuinely love to know if that's the case, and I really wish to create an Ask HN linking to this discussion, as I don't think my take is unreasonable?

    • resters 4 days ago ago

      At least the deal is denominated in watts rather than currency which may hyperinflate soon.

    • jononor 4 days ago ago

      I do not think the leveraging is going to end there. I suspect this will be used to justify/secure power generation investments, possibly even nuclear. Likely via one or more of the OpenAI/Altman adjacent power startups.

      • amluto 4 days ago ago

        On the bright side, if lots of power capacity is added and most of the GPUs end up idle, then there might be cheap power available for other uses.

        • jazzyjackson 4 days ago ago

          Power generation is not a monolithic enterprise. If more supply is built than needed, certain suppliers will go bankrupt.

          • ogaj 4 days ago ago

            They may, but that doesn’t mean that the capacity disappears. It may require some assumptions about USG willingness to backstop an acquisition but it’s not a significant leap to think that the generation capacity remains in (more capable?) hands.

            • mcny 4 days ago ago

              Speaking of capacity, what happened to all the "dark fiber" that was supposedly built for Internet 2 or whatever? The fiber doesn't go away just because a bubble burst, right?

              • HPsquared 4 days ago ago

                Railways are similar, many were built by investors who lost all their investment but the railway is still there.

          • lucianbr 4 days ago ago

            What are the chances suppliers will go bankrupt but the plants get sold and still produce power?

        • NewJazz 4 days ago ago

          Not if Ellison trickles it out for maximum profit.

        • holoduke 4 days ago ago

          And computing in general gets cheaper.

          • lawlessone 4 days ago ago

heating our homes next winter with clusters of H100s

      • bobmcnamara 4 days ago ago

        Altman is all in on converting the solar system into a Dyson sphere to power OpenAI.

      • yibg 4 days ago ago

        Isn't that already happening via Oklo? Up 500%+ YTD.

    • lacy_tinpot 4 days ago ago

      This is more like reinvesting into the business as it's growing. It's a positive sum loop.

      Nvidia makes money by selling to OpenAI. OpenAI makes money by selling a service to users that uses Nvidia. So Nvidia invests in the build out and expansion of the infrastructure that will use Nvidia.

      This is a classic positive sum loop.

      It's not that different than a company reinvesting revenue in growing the company.

      • binarymax 4 days ago ago

        But is OpenAI recouping this? I remember seeing reports a year ago that it was in the realm of $700M/mo in inference costs for them - are they earning that now?

Of course, taking a loss and reinvesting is a known strategy, but I don't see how OpenAI is making enough money to pay for all this, now or in the future.

        • lacy_tinpot 4 days ago ago

          It hasn't monetized any of its services. There are currently no ads. And it's not selling user data, well not yet.

          That's literally hundreds of billions worth of revenue.

          Just look at the options OpenAI has to generate revenue beyond subscriptions.

        • sssilver 4 days ago ago

          How would you "see" that, given that OpenAI isn't public?

    • big_toast 4 days ago ago

      Is this more of an accounting thing?

Is there some (tax?) efficiency here? OpenAI could take money from another source, pay it to Nvidia, and receive GPUs; instead, taking investment from Nvidia acts as a discount in some way.

      (In addition to Nvidia being realistically the efficient/sole supplier of an input OpenAI currently needs. So this gives

        1. Nvidia an incentive to prioritize OpenAI and induces a win/win pricing component on Nvidia's GPU profit margin so OpenAI can bet on more GPUs now
      
        2. OpenAI some hedge on GPU pricing's effect on their valuations as the cost/margin fluctuates with new entrants
      )?
      • wmf 4 days ago ago

        It sounds like Nvidia has so much cash already that they would prefer to own x% of OpenAI instead.

    • throwaway667555 4 days ago ago

      It's not round tripping. Economically, Nvidia is investing property in OpenAI. It's not investing nothing, far from it.

    • jedberg 4 days ago ago

      It's interesting how deals like this are politically relevant. Nvidia refused to do deals like this (investing in companies buying large amounts of NVIDIA GPUs) after they got the hammer from Biden's SEC for self dealing due to their investment in Coreweave.

      But now that there is a new SEC, they are doing a bunch of these deals. There is this one, which is huge. They also invested in Lambda, who is deploying Gigawatt scale datacenters of NVIDIA GPUs. And they are doing smaller deals too.

    • belter 4 days ago ago

      Same as they are doing with CoreWeave. In a sane world the SEC would do something but we are past that. What about Boeing opening an airline company and selling airplanes to itself?

    • stale2002 4 days ago ago

      Does anyone in the finance business know how legal all this is? I am hearing terms like "round tripping" being thrown around: a practice where a company sells and buys back its own product to artificially inflate revenue.

      I'm asking because it's not just OpenAI that they are apparently doing this with; it's multiple other major GPU providers, like CoreWeave.

      And it's just being done all out in the open? How?

      • wmf 4 days ago ago

        IANAL but you can do pretty much anything as long as it's disclosed. The only problem with round-tripping is doing it secretly.

        As an investor you may decide that round-tripping is dumb but in that case your recourse is to sell the stock.

      • raincole 4 days ago ago

        First of all, it's not 'textbook round tripping' at all. The parent commenter is dead wrong but HNers upvote when they see "AI is a bubble."

        Textbook round tripping is like: OpenAI buys GPUs from Nvidia, and the only reason it buys those GPUs is to resell them back to Nvidia, or to do nothing with them. It doesn't become round tripping just because OpenAI is taking investment and buying stuff from Nvidia at the same time.

        Unless you really believe OpenAI has no intention to use these GPUs for other purposes (like training GPT-6. I know, a crazy idea: OpenAI will train and release a model), it's not round tripping.

        • stale2002 4 days ago ago

          It's not just about OpenAI, though, even if they have the biggest/flashiest deal. The other, more obvious example is CoreWeave.

          > OpenAI buys GPUs from Nvidia. And the only reason it buys these GPUs is to resell it back to Nvidia

          Funny you should say this. Nvidia having those GPUs rented back to it is also something that's happening.

          https://www.kerrisdalecap.com/wp-content/uploads/2025/09/Ker...

          "As detailed by The Information, in early 2023 Nvidia invested $100 million in equity and signed a $1.3 billion rental agreement through 2027, under which it rents back GPUs from CoreWeave to support internal R&D and its DGX cloud offering."

          "CoreWeave is not the only neocloud to benefit from Nvidia’s strategic support. Nvidia has actively supported an ecosystem of emerging AI infrastructure providers – including Lambda, Nebius, and Applied Digital –"

          They are quite literally buying GPUs only to rent them right back to Nvidia.

          And these are just the public deals. Is Nvidia systematically selling GPUs and having them rented back to it by every major GPU cloud provider?

          https://www.investing.com/analysis/coreweave-nvidia-partners...

          "This deep alliance culminates in the new $6.3 billion agreement. The deal’s most critical component is a strategic commitment from NVIDIA to purchase any of CoreWeave’s unsold cloud computing capacity through April 2032"

      • yard2010 4 days ago ago

        How I see it - the people with the money make the rules, why would they make rules against themselves?

        • stale2002 4 days ago ago

          Yes, but my point is that this almost feels like an Enron case. Things were fine, until they weren't. And then in retrospect the fraud is obvious.

          I'm just surprised that nobody is yelling to the rooftops about practices that are just so out in the open right now.

    • eitally 4 days ago ago

      I'm not saying you're wrong, but with Nvidia pulling back from DGX Cloud, it makes sense that they'd continue to invest in their strategic partners (whether it's software companies like OpenAI or infrastructure vendors like Coreweave).

    • gdiamos 4 days ago ago

      It forces us to confront a question.

      How much investment and prioritization in scaling laws is justified?

      • aldousd666 4 days ago ago

        Regardless of the scaling hypothesis, they need the compute to serve the models at scale.

    • radium3d 4 days ago ago

      Really curious how xAI is working out financially. Grok blows me away for coding.

      • radium3d 4 days ago ago

        It's interesting how profitable Tesla is despite the huge investments in their AI training infrastructure. They seem to be one of the best positioned companies that can maintain enough profitability to be able to afford their AI infrastructure without issue.

        • mirekrusin 4 days ago ago

          Google?

          • radium3d 3 days ago ago

            Gemini isn’t the best though, I’ve found it not nearly as good as grok

    • dummydummy1234 4 days ago ago

      My thought is think of all the really cheap compute that will be available to researchers. Sure, it will crash but at the end of the day there will be a huge glut of gpus that datacenters will be trying to rent out near cost.

      I (as an uninformed rando) think that there are a lot of research ideas that have not been fully explored because doing a small training run costs $100k. If that drops to $1,000, then there are a lot more opportunities to try new techniques.

      • koolala 3 days ago ago

        With this level of power usage it might be the opposite. Once they can't subsidize the cost it might increase.

    • zitterbewegung 4 days ago ago

      I do think about this where they are creating a printing / cash burning cycle where both OpenAI keeps on doing raises and Nvidia can get more sales...

    • javiramos 4 days ago ago

      "...man this is gonna crash hard when reality sets in."

      Given the amount of money invested and the expectations, the crash will be of cataclysmic proportions

    • kelvinjps 4 days ago ago

      They're just buying from and investing in each other?

    • mrandish 4 days ago ago

      > throwing more cards on a house of cards.

      Nice metaphor! Huge bubbles usually get a historical name like "Tulip Craze" or "Dot Com Crash" and when this bubble bursts "House of Cards" is a good candidate.

      • neilv 4 days ago ago

        Oh, I see now: house of cards (usual meaning) + throwing more cards on (like throwing money on the fire, and also how you destabilize a house of cards) + GPU cards in this case (even though they're not necessarily cards). I like it.

      • jama211 4 days ago ago

        I just hope it works out like the dot com crash did in the long run: the internet kept going and bringing real value; it just needed a big market reset when it popped.

    • SilverElfin 4 days ago ago

      I also recall reading that OpenAI is developing its own chips. What happened to that?

      • aldousd666 4 days ago ago

        I don't think the NVIDIA deal is an exclusive one... They can still use TPUs and GPUs and other cloud providers if they like. They may still be planning to.

    • elorant 4 days ago ago

      Well I hope it crashes so we can get back to normalized GPU prices.

    • vessenes 4 days ago ago

      Almost every model trained by the majors has paid for itself with inference fees.

      I’m not saying there isn’t a bubble, but I am saying if the researchers and strategists absolutely closest to the “metal” of realtime frontier models are correct that AGI is in reach, then this isn’t a bubble, it’s a highly rational race. One that large players seem to be winning right now.

      • jsheard 4 days ago ago

        > Almost every model trained by the majors has paid for itself with inference fees.

        Even if we assume this is true, the downstream customers paying for that inference also need it to pay for itself on average in order for the upstream model training to be sustainable, otherwise the demand for inference will dry up when the music stops. There won't always be a parade of over-funded AI startups burning $10 worth of tokens to bring in $1 of revenue.

        • Rover222 4 days ago ago

          My employer spends $100k/month or more on OpenAI fees. Money well spent, in both product features and developer process. This is just one fairly small random startup. Thousands of companies are spending this money and making more money because of it.

          • Rebuff5007 4 days ago ago

            Curious what makes you think the money is well spent.

            I can maybe digest the fact that it helped prototype and ship a bit more code in a shorter time frame... but does that bring in enough new customers, or a high enough value product, to justify $100k a month?!

            • Rover222 3 days ago ago

              Probably 80% of that money goes towards product features that are crucial to retention and acquisition of customers, and the business is profitable. Could those features exist without AI integrations? Some yes, but the data would be limited/inferior, other features would not be possible at all.

              The 20% spent on dev tooling seems well-spent. About 10 devs on the team, and all at least 2x (hard to measure exactly, but 2x seems conservative) more productive with these tools.

            • neutronicus 4 days ago ago

              Some of that $100k/month might be powering the features, rather than supporting development.

              • Rover222 3 days ago ago

                yeah it's probably 80% going to product features (processing/classifying data, and agentic workflow features), and 20% to dev tools

        • onesociety2022 4 days ago ago

          Isn't most of OpenAI's revenue from end users rather than from token sales? For Anthropic it is the opposite: almost all of their revenue comes from API usage. So even if AGI/ASI don't pan out, OpenAI could have a great consumer-focused inference business, building useful applications (and new devices) on existing state-of-the-art LLMs while scaling back heavy investment in next-gen model training. I think just replacing Google Search and smartphones with a new AI device would be a massive consumer business that OpenAI could go after without any major advancements in AI research.

        • vessenes 3 days ago ago

          I’m the other way — the cost of launching a creative / interesting software company / project just got cut to 1% or so. (I said launching. Maintaining … obviously not quite as good on the numbers).

          I propose that software creation, and therefore the demand for software creation, are subject to Jevons' paradox.

        • ben_w 4 days ago ago

          Tokens that can be purchased for $10 may or may not provide the purchaser with almost any dollar-denominated result, from negative billions* to positive billions**.

          Right now, I assume more the former than the latter. But if you're an optimistic investor, I can see why one might think a few hundred billion dollars more might get us an AI that's close enough to the latter to be worth it.

          Me, I'm mostly hoping that the bubble pops soon, in a way that lets me catch up with what the existing models can already provide real help with (which is well short of an entire project, but still cool and significant).

          * e.g. the tokens are bad financial advice that might as well be a repeat of SBF

          ** how many tokens would get you the next Minecraft?

      • sylario 4 days ago ago

        The thing is that AI researchers who are not focused only on LLMs do not seem to think it is in reach.

        • sindriava 4 days ago ago

          Demis Hassabis seems to think this and not only does he not focus only on LLMs, he got a nobel prize for a non-LLM system ;)

          • belter 4 days ago ago

            As far as I know, that Nobel prize was for being the project manager...

            • vessenes 3 days ago ago

              If you talk to any of his early investors, they considered him absolutely crucial to the project.

              • belter 2 days ago ago

                They say the same about Sam Altman....

      • mossTechnician 4 days ago ago

        Which of these model-making companies have posted a profit? I'm not familiar with any.

        • vessenes 4 days ago ago

          They account internally for each model separately; Dario said they even think of each model as a separate company on Dwarkesh some time ago.

          Inference services are wildly profitable. Currently companies believe it’s economically sensible to plow that money into R&D / Investment in new models through training.

          For reference, oAI’s monthly revs are reportedly between $1b and $2b right now. Monthly. I think if you do a little napkin math you’ll see that they could be cashflow positive any time they wanted to.

          • jenkinomics 4 days ago ago

            Again with the "this is very profitable if you don't account for the cost of creating it?"

            Then my selling 2 dollars for 1 dollar is a wildly profitable business as well! Can't sell them fast enough!

            Why does it seem like so many people have ceased to think critically?

            • vessenes 3 days ago ago

              You have LLM derangement syndrome, and don’t understand.

              Say the first model cost $2 to make. On metered sales, they’ve made $10 on it.

              They then decide to make a $20 model, raising more money. It turns out, that model made $100.

              They then decide to make a $1,000 model. That model made $5,000.

              There are two possible paths for their shiny new $10,000 model: either it will be a better market fit than the 1k model, or it will not.

              If it is a better market fit than the 1k model, then it seems very likely that at some point it will make more than $10,000 (2x the prior model’s utility).

              If it does not provide better value, then you can scrub that model and keep selling the $1k model. Eventually it will likely return the additional $10k of investor capital through profits.

              What we have seen is this above scenario, with a couple twists: first, the training (capital investment) decisions overlay the useful life of the prior model, so you have to tease out the profitability when you think strategy. Second, it turns out there’s quite a lot of money to be made distilling models the market likes into models that give like 90% better profit.

              So, these businesses paying billions of dollars to train frontier models are absolutely rational actors. They are aggressive actors, engaged in an arms race, and not all of them will survive. But right now, with current inference demand, if all the global training capital dried up, (and therefore we are stuck with current models for some time), they would become highly, highly profitable companies during the period where fast followers tried to come in and compete on price.
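              The generational cashflow argument above can be sketched as a toy calculation; all figures are the hypothetical numbers from this comment, not real financials:

```python
# Toy model of the per-generation economics described above, using the
# hypothetical figures from the comment (not real OpenAI/Anthropic numbers).

generations = [
    {"name": "gen-1", "training_cost": 2, "inference_revenue": 10},
    {"name": "gen-2", "training_cost": 20, "inference_revenue": 100},
    {"name": "gen-3", "training_cost": 1_000, "inference_revenue": 5_000},
]

cumulative = 0
for g in generations:
    # Each model is treated like a standalone company: revenue minus its
    # own training capex.
    profit = g["inference_revenue"] - g["training_cost"]
    cumulative += profit
    print(f"{g['name']}: per-model profit = {profit}, cumulative = {cumulative}")

# Each generation pays for itself on inference, yet the company as a whole
# can still report losses: the next (much bigger) training bill is paid up
# front, before the current model's inference revenue has fully accrued.
```

              On these made-up numbers every generation is individually profitable, which is the "each model is a separate company" framing; the open question in the thread is whether each 10x-larger training bill keeps finding a matching revenue multiple.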

              • nouarngin a day ago ago

                Is the profitable model that makes 5x its cost in revenue here in the room with us right now?

            • neutronicus 4 days ago ago

              OpenAI claims that each GPT generation has sold enough inference at high enough margin to recoup the cost of training it.

              The company overall is still not profitable because these proceeds are being used to fund training the next GPT generation.

      • ACCount37 4 days ago ago

        Ever since NLP and CSR, the two unassailable fortresses of every AI winter, fell to LLMs, I've had no doubt that AGI is within reach.

        It's less "will it happen" now, and more "whether it hits in a few decades or in a few years".

      • mountainriver 4 days ago ago

        The idea that it’s a bubble on the frontier model side is insane. AI assisted coding alone makes it the most valuable thing we’ve ever created.

        • switchers 4 days ago ago

          Get your head out of the proverbial: a bullshitting machine that lets some developers do things faster, if they modify how they develop, isn't even close to the most valuable thing we've ever created.

          • vessenes 3 days ago ago

            I think you’re wrong. Consider the following. It’s 1995. You and your next door neighbour Jeff Bezos have both just raised $10mm from competing VCs to build amazon.com.

            You can choose to have a Claude API portal to the future where you pay 2025 prices for token inference, or you can skip it, and use 1995 devs to build your competitor.

            Which do you do?

          • mountainriver 4 days ago ago

            It easily is, nothing else is even remotely close. Software is the most valuable industry on earth and we are well on our way to fully commoditizing it.

        • vessenes 3 days ago ago

          Totally agreed.

    • glitchc 4 days ago ago

      Does this mean they pay for it through consumer GPU sales?

      • threeducks 4 days ago ago

        Last quarter, NVIDIA reported data-center revenue of $30.8 billion and gaming revenue of only $3.3 billion.

        https://nvidianews.nvidia.com/news/nvidia-announces-financia...

        This also explains why NVIDIA will not sell high-VRAM consumer GPUs: it would cannibalize their exorbitant data-center profits.

        • mirekrusin 4 days ago ago

          If they won't, somebody else will. And frankly that alone can pop their bubble: it caps the margins they can ever charge, margins that are already negative. Apparently for every $1 made they currently pay $2.25?

      • wmf 4 days ago ago

        No, those are a drop in the bucket.

    • andai 4 days ago ago

      I think you have just described the global economy.

    • jama211 4 days ago ago

      Perhaps I should short openAI… would you try it?

      • jama211 3 days ago ago

        I also ask this as a rationalist technique: the moment you ask "what outcome would you actually put money on?", people suddenly get far more realistic about how confident they actually feel. You get a whole lot less "oh they're DEFINITELY gonna fail/succeed!" hyperbole when money is on the line.

    • anothermathbozo 4 days ago ago

      What is this a bubble on? What does said bubble collapsing look like?

      • bitmasher9 4 days ago ago

        Nvidia is giving OpenAI money (through investment) to buy Nvidia chips. The bubble is that Nvidia got that money from its crazy high stock price; the extra investment raises OpenAI's valuation, and the increased sales raise Nvidia's valuation. If the valuations see a correction, then spending like this will decrease, further decreasing valuations.

        Bubble collapsing looks like enshittification of OpenAI tools as they try to raise revenues. It’ll ripple all throughout tech as everyone is tied into LLMs, and capital will be harder to come by.

        • drexlspivey 4 days ago ago

          > The bubble is that Nvidia got that money from its crazy high stock price,

          This is totally false: NVDA has not done any stock offerings. The money is coming from the ungodly amount of GPUs they are selling. In fact they are doing the opposite; they are buying back their stock because they have more money than they know what to do with.

          • JCM9 4 days ago ago

            A company buys back its stock if it thinks the stock is underpriced. Otherwise when “you have more money than you know what to with” you give it to your shareholders via a dividend. A concept mostly forgotten by tech companies.

            • rhetocj23 3 days ago ago

              Ermm, this is nothing but a wealth transfer from the shareholders who sell at too low a price to those who don't.

            • drexlspivey 4 days ago ago

              A company buys back stock because distributing dividends incurs a 30% withholding tax.

              • rhetocj23 3 days ago ago

                Sorry guys, but this is why I don't want to see many finance-related posts here: very few know what they are talking about.

                Buybacks are the preferred method of RETURNING CASH to shareholders, because dividends historically have been sticky. Buybacks are flexible.

                Buybacks are also done to optimise the debt ratio, to minimise the firms cost of capital and thereby maximizing firm value.

        • vessenes 4 days ago ago

          NVDA outstanding shares are down ~1.2% year over year; the company has been buying back its own shares with —>> profits <<— to the tune of tens of billions.

          Meanwhile NVDA stock is mildly up on this news, so the current owners of NVDA seem to like this investment. Or at least not hate it.

          Agreed that we’ll see ad-enabled ChatGPT in about five minutes. What’s not clear is how easily we’ll be able to identify the ads.

        • mountainriver 4 days ago ago

          Valuations won’t see a correction for the core players, I have no idea why people think that. Both of these companies are already money factories.

          Then consider we are about to lower interest rates and kick off the growth cycle again. The only way these valuations are going is way up for the foreseeable future

        • babelfish 4 days ago ago

          > Bubble collapsing looks like enshittification of OpenAI tools as they try to raise revenues

          Why does monetizing OpenAI tools lead to bubble collapse? People are clearly willing to pay for LLMs

          • bitmasher9 4 days ago ago

            You read this backwards. If the bubble collapses we will see OpenAI raise capital by increasing revenue instead of investment.

      • shawabawa3 4 days ago ago

        AI and tech companies

        Collapse might look a little like the dot com bubble (stock crashes, bankruptcies, layoffs, etc)

        • wongarsu 4 days ago ago

          And it's worth reiterating that a bubble does not mean the technology is worthless. The dot com bubble collapsed despite the internet being a revolutionary technology that has shaped every decade since. Similarly LLMs are a great and revolutionary technology, but expectations, perception and valuations have grown much faster than what the technology can justify

          These hype cycles aren't even bad per se. There is lots of capital to test out lots of useful ideas. But only a fraction of those will turn out to be both useful and currently viable, and the readjustment will be painful

        • HarHarVeryFunny 4 days ago ago

          Plus unused dark fiber = unused AI data centers and power generation capacity.

      • jenkinomics 4 days ago ago

        I think ultimately the conclusion that we're in a bubble is bad analysis. It jumps over a chasm and assumes that analogy to past historical situations allows us to draw conclusions.

        This isn't a bubble. This is the collapse of 300 years of modern capitalism into corporate techno feudalism.

        This won't crash and lead to a recession or depression. We are at the end game. Look around you. Capital is going scorched earth on labor. They are winning. Cost of living in metropolitan areas is exploding, and most of us will end up begging for scraps in peripheral areas.

        This is the result of everything the elites have been working towards for the past few decades. Climate catastrophe is the cherry on the cake: they will shock therapy us into the last few bits. There will be corporate citizenship that enables one to live as a demi-god at the behest of the owners, and survival in the wastelands for the rest of us.

      • Drunkfoowl 4 days ago ago

        High end server gpus and AI roi expectations.

        • reactordev 4 days ago ago

          I think everyone is underestimating the advancements in wafer tech and server compute over the last decade. Easy to miss when it’s out of sight out of mind but this isn’t going anywhere but up.

          The current SOTA is going to pale in comparison to what we have 10 years from now.

          • zer00eyz 4 days ago ago

            > I think everyone is underestimating the advancements in wafer tech and server compute over the last decade.

            What advancements?

            We have done a fabulous job of lowering power consumption while exponentially increasing the density of cores and, to a lesser extent, transistors.

            Delivering power to data centers was becoming a problem 20-ish years ago. Today power density and heat generation are off the charts. Most data center owners are lowering per-rack system density to deal with the "problem".

            There are literal projects pushing not only water cooling but refrigerant in the rack systems, in an attempt to get cooling to keep up with everything else.

            The dot com boom and then Web 2.0 were fueled by Moore's law, by clock doubling, and then by the initial wave of core density. We have run out of all of those tricks. The new steps we're putting out have increased core densities but not lowered costs (because yields have been abysmal). Look at Nvidia's latest cores: they simply are not that much better in terms of real performance when compared to previous generations. If the 60 series shows the same slack gains, then hardware isn't going to come along to bail out AI, which continues to demand MORE compute cycles (tokens on thinking, anyone?) rather than fewer with each generation.

    • Rebuff5007 4 days ago ago

      Sure, but its going to be a great plot for the movie that comes out in five years.

    • chairmansteve 4 days ago ago

      I agree that it is a bubble.

      But the "round tripping" kind of makes sense. OpenAI is not listed, but if it was, some of the AI investment money would flow to it. So now, if you are an AI believer, NVidia is allocating some of that money for you.

    • ivape 4 days ago ago

      You know you can sell inferencing at near 100% margins, right? More, even.

    • pixelready 4 days ago ago

      The real question is not whether this is a bubble, since, as you mentioned, even if AI settles into somewhat useful semi-mainstream tech, there is no way any of the likely outcomes can justify this level of investment.

      The real question is what are we gonna do with all this cheap GPU compute when the bubble pops! Will high def game streaming finally have its time to shine? Will VFX outsource all of its render to the cloud? Will it meet the VR/AR hardware improvements in time to finally push the tech mainstream? Will it all just get re-routed back to crypto? Will someone come up with a more useful application of GPU compute?

      • halJordan 4 days ago ago

        Ai is already in semi-useful mainstream tech. There's a massive misunderstanding on this site (and other neo luddite sites) that somehow there is no "long tail" of business applications being transformed into ai applications.

        • lawlessone 4 days ago ago

          any examples?

          • sindriava 4 days ago ago

            Current systems are already tremendously useful in the medical field. And I'm not talking about your doctor asking ChatGPT random shit, I'm saying radiology results processing, patient monitoring, monitoring of medication studies... The list goes on. Not to mention many of the research advances done using automated systems already, for example for weather forecasting.

            • AlexandrB 4 days ago ago

              I'm getting real "put everything on the blockchain" vibes from answers like this. I remember when folks were telling me that hospitals were going to put patient records on the blockchain. As for radiology, it doesn't seem this use of AI is as much of a "slam dunk" as it first appeared[1][2]. We'll see, I guess.

              Right now I kind of land on the side of "Where is all the shovelware?". If AI is such a huge productivity boost for developers, where is all the software those developers are supposedly writing[3]? But this is just a microcosm of a bigger question. Almost all the economic growth since the AI boom started has been in AI companies. If AI is revolutionizing multiple fields, why aren't relevant companies in those fields also growing at above-expected rates? Where's all this productivity that AI is supposedly unlocking?

              [1] https://hms.harvard.edu/news/does-ai-help-or-hurt-human-radi...

              [2] https://www.ajronline.org/doi/10.2214/AJR.24.31493

              [3] https://mikelovesrobots.substack.com/p/wheres-the-shovelware...

            • lawlessone 4 days ago ago

              OK, but I am asking for uses of LLMs specifically.

              Of course I agree ML has already helped in many other areas and has a bright future. But the thing everyone is talking about here is LLMs.

      • ACCount37 4 days ago ago

        "The bubble will pop any minute now, any second, just you wait" is cope.

        Even if AI somehow bucks the trend and stops advancing in leaps? It's still on track to be the most impactful technology since smartphones, if not since the Internet itself. And the likes of Nvidia? They're the Cisco of AI infrastructure.

        • HarHarVeryFunny 4 days ago ago

          The dot com bubble popped. It doesn't mean that the internet wasn't successful, just that people got way too excited about extrapolating growth rates.

          AI is here to stay, but the question is whether the players can accurately forecast the growth rate, or get too far ahead of it and get financially burnt.

        • ben_w 4 days ago ago

          The importance of the Internet didn't prevent the .com bubble from bursting.

    • cyanydeez 4 days ago ago

      Smells like the Yahoo-driven 2000 bubble. Definitely short every ancillary business involved.

  • lvl155 4 days ago ago

    We are definitely closer to the top in this market. Do people even realize what they’re predicting in terms of energy use? It’s going to be a wasteland territory sooner than people think.

    • webdevver 4 days ago ago

      It's going to go 10x from where it is now.

  • cainxinth 4 days ago ago

    Next Year: OpenAI announces it is seeking funding for a Dyson Sphere

    • wiseowise 4 days ago ago

      Where do I sign? Finally back to a space race instead of an era of social network degradation.

      • amlib 3 days ago ago

        I don't think it's a good idea to have the same sociopaths who brought us the current status quo be propping up a new space race...

    • BHSPitMonkey 4 days ago ago

      The production rate of ~~paperclips~~ tokens isn't growing quickly enough!

  • andirk 3 days ago ago

    Most salient point of all of this: gigawatts is pronounced "jigga-watts", as Back to the Future's Doc correctly does.

  • bertili 4 days ago ago

    Whats in it for Nvidia? At the recent 300B valuation, 25% equity?

  • gessha 4 days ago ago

    I feel like data center deployments are the new metric companies use to show growth, vs. the old headcount growth.

  • 0xTJ 4 days ago ago

    Stating compute scale in terms of power consumption is such a backwards metric to me, assuming that you're trying to portray it as something positive.

    It's like selling steel by the average fractional number of mining deaths that went into producing it. Sure, at a given moment there will be some ratio between average deaths and steel, but that's a number that you want to be as low as possible.

    • dmoy 4 days ago

      Stating compute scale in terms of power consumption is exactly how one looks at data centers or capacity planning right now though. It's the major constraint.

      It's just a different abstraction level.

  • rippeltippel 4 days ago

    Life was so much cheaper in the '80s, when you could travel in time with just 1.21 GW.

  • hangonhn 4 days ago

    Anyone else find it fascinating that gigawatts, a unit of power, are the metric used for this deal?

  • beastman82 4 days ago

    I only have a 600W computer

    • tim333 3 days ago

      Mine is using 9W according to Coconut Battery and working fine for most purposes.

  • rawgabbit 4 days ago

    Does this affect OpenAI’s renegotiation of their deal with Microsoft?

  • cubefox 4 days ago

    To the people who are calling this evidence of a bubble: There is no credible indication that AI in general is a bubble, even if not all investments will make sense in retrospect. Quite the opposite, the progress in the field over the last few years is staggering. AI systems are becoming superhuman at more and more tasks. It's only a question of time till AI will outperform us at everything.

    • jcranmer 4 days ago

      > There is no credible indication that AI in general is a bubble, even if not all investments will make sense in retrospect.

      If you add up all of the contracts that OpenAI is signing, it's buying something like $1 trillion/year worth of compute. To merely break even, it would have to make more money than literally every other company on the planet, fairly close to twice the current highest revenue company (Walmart, a retailer, which, yeah, there's a reason that has high revenue).
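
      A back-of-envelope check of that arithmetic (a sketch: the ~$1T/year figure is the estimate above, and Walmart's ~$648B annual revenue is an approximate FY2024 figure; both are assumptions, not exact numbers):

```python
# Rough break-even check. Both inputs are approximations: ~$1T/year is the
# estimated total of OpenAI's contracted compute spend, and ~$648B is roughly
# Walmart's fiscal-2024 revenue.

def revenue_multiple_needed(annual_spend_usd: float, benchmark_revenue_usd: float) -> float:
    """Multiple of a benchmark company's revenue needed just to cover the spend."""
    return annual_spend_usd / benchmark_revenue_usd

multiple = revenue_multiple_needed(1.0e12, 648e9)
print(f"Break-even needs ~{multiple:.2f}x Walmart's annual revenue")  # ~1.54x
```

      On these (rough) numbers, break-even revenue is closer to 1.5x Walmart than 2x, but the point stands: it would have to exceed the largest-revenue company on Earth.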

      • cubefox 4 days ago

        They are aiming at being the first to develop an AGI and eventually superintelligence. Something that can replace human workers. Walmart is small fish in comparison. OpenAI is currently in the lead, so their chances are decent.

        • tptacek 3 days ago

          This is an argument that OpenAI needs to achieve a supernatural outcome in order to be a financial success.

          • cubefox 3 days ago

            So you think superhuman intelligence is supernatural.

            • tptacek 3 days ago

              If you want to substitute "science fiction" that's fine too. We generally don't bank real investment expectations on science fiction outcomes. The positive expectation scenario you've provided is "OpenAI obsoletes workers, to the extent that Walmart is small fish". That's a sci-fi outcome, not a rational expectation.

              • cubefox 3 days ago

                You would have called ChatGPT or Dall-E science fiction shortly before they actually came out.

                • tptacek 3 days ago

                  Perhaps. Also time travel machines. But for different reasons.

    • mikhmha 4 days ago

      There is no indication of being in a bubble when you're actually in one. It's only after the bubble pops that people recognize it in hindsight. Otherwise there would be no bubbles and we wouldn't see large institutions fall for this crap.

      What is a credible indication? Who is credible? It's all subjective. It's possible to fool yourself endlessly when financial incentives are involved. The banks did it with mortgages.

  • shmerl 4 days ago

    They should spend gigawatts on something more useful instead.

  • xyst 4 days ago

    This is awful. You should know that the private companies building these datacenters often get backdoor deals with PUCs. They do NOT pay their fair share for their consumption, and the extra cost is shouldered onto the general rate payers.

    More degenerate "privatize the profits, socialize the losses" behavior. The American public continues to get bent by billionaires and continues to elect folks who will gladly lube them up in preparation for that event.

  • dboreham 4 days ago

    Perpetual Money Machine

  • mrheosuper 4 days ago

    I am wondering, do they still use 48V in those computers? That's a lot of amps.
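
    Ohm's-law arithmetic gives a sense of scale (a sketch; 48V is the comment's assumed rack bus voltage, and 10GW is the headline facility figure, which no single bus would ever carry):

```python
# I = P / V: aggregate current required to deliver a given power at a given voltage.

def current_amps(power_watts: float, voltage_volts: float) -> float:
    return power_watts / voltage_volts

# 10 GW at a 48 V bus works out to roughly 2.1e8 A in aggregate, which is why
# facilities distribute at tens or hundreds of kilovolts and only step down
# to 48 V (or lower) right at the rack.
print(f"{current_amps(10e9, 48):,.0f} A")
```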

  • catigula 4 days ago

    Can we get some laws to force these companies to start subsidizing the consumer grids they're pummeling?

    The electric bills are getting out of hand.

    • vmg12 4 days ago

      They would build their own power lines / grid if they could.

      • bananapub 4 days ago

        ... why? the current (heh) situation is that they do these big announcements and then local/state governments around the US get in a bidding war to try to shift costs from the datacenter operator onto their own citizens, in addition to offloading all of the capex.

      • catigula 4 days ago

        No thanks, I'll just take subsidies to my bill.

        • vmg12 4 days ago

          Well, I suggest you go into politics and do something about it rather than be pointlessly smug on the internet.

          If you're actually interested, the reason it's important to build out the grid even more instead of "subsidizing" is that the current grid can't handle renewables well, and we need to improve it if we want to use sustainable energy.

    • 2OEH8eoCRo0 4 days ago

      What will happen if/when the AI bubble pops and there is far more grid capacity than demand? Power plant bailouts?

      • davis 4 days ago

        Load growth for the last 15 years has been very small, but load growth going forward is expected to rise due to electrification of all things to decarbonize the economy. This means home heating, electric cars, heavy industries, obviously data centers, and the list goes on. So even if we end up with more grid capacity than demand (this seems unlikely), it will be used before too long.

      • JCM9 4 days ago

        Will just make capacity available for electrification of other infrastructure like heat pumps, electric cars, and so on. Lots of other folks would happily buy that power. The whole AI bubble is just driving up electricity pricing for everyone else at the moment.

  • fennecbutt 4 days ago

    I do think that us humans are gonna cook ourselves on this planet.

    However at least AI is doing _something_ with the energy. Cryptocurrency is such a fucking useless waste of energy I'd take anything over it.

  • danielfalbo 4 days ago

    And the bubble keeps bubble-ing

  • mrheosuper 4 days ago

    The Earth is getting warmer, and we're spending a lot of money to make it warm 10GW faster.

  • mrcwinn 4 days ago

    Very foolish of them not to leverage SoftwareFPU. And with minimal effort Performas are rackable.

  • elzbardico 4 days ago

    The good thing about this is that when the AI bubble bursts, we will have a lot of energy infrastructure that wouldn't have been built otherwise.

  • nicman23 4 days ago

    so like 3 racks of h100?

  • lumenwrites 4 days ago

    Yaay, one step closer to torment nexus.

    • nh23423fefe 4 days ago

      Low-effort comment whose content is a stale reference to other low-effort memes

  • zmmmmm 4 days ago

    There's a worrying lack of structural integrity building up in this hype bubble, and this adds more fuel to the fire.

    You essentially have Nvidia propping up its own valuation here by being its own customer. If they sold a bunch of H100's to themselves and then put it as revenue on their books they'd be accused of fraud. Doing it this way is only slightly better.

  • truelson 4 days ago

    Ben Thompson and Doug O'Laughlin ( https://stratechery.com/2025/the-oracle-inflection-point-app... (paywall), https://www.fabricatedknowledge.com/p/capital-cycles-and-ai ) are calling it a bubble, largely because we've entered the cycle where cash flows aren't paying for it, but debt is (see Oracle: they won't be able to pay for their investment with cash flow).

    I think even Byrne Hobart would agree (from his interview with Ben): "Bubbles are this weird financial phenomenon where asset prices move in a way that does not seem justified by economic fundamentals. A lot of money pours into some industry, a lot of stuff gets built, and usually too much of it gets built and a bunch of people lose their shirts and a lot of very smart, sophisticated people are involved with the beginning, a lot of those people are selling at the peak, and a lot of people who are buying at the peak are less smart, less sophisticated, but they’ve been kind of taken in by the vibe and they’re buying at the wrong time and they lose their shirts, and that’s really bad."

    This is a classic bubble. It starts, builds, and ends the same way. The technology is valuable, but it gets overbought/overproduced. Still no telling when it may pop, but remember asset values across many categories are rich right now and this could hurt.

  • rvz 3 days ago

    This is a dot-com level bubble.

  • t0lo 4 days ago

    "Hey, there's a bubble"

  • pmdr 4 days ago

    If I had shovels to sell, I'd definitely announce a strategic partnership to have a huge quarry dug by hand.

    Seriously, is there anyone in the media keeping unbiased tabs on how much we're spending on summarizing emails and making creatives starve a little more?

    • CodingJeebus 4 days ago

      Ed Zitron is an AI skeptic from the market perspective, highly recommend his stuff. It’s definitely not comforting to read, but he’s doing the math behind these headlines and it’s not adding up at all[0]

      0: https://www.wheresyoured.at/

      • pmdr 4 days ago

        Yeah, I pretty much agree with what he's been doing. But he's not what the average person would call 'media,' so his reach is severely limited.

  • andreicaayoha 4 days ago

    pls

  • iphone_elegance 4 days ago

    fancy stock buyback lol

  • sheerun 4 days ago

    <3

  • eagerpace 4 days ago

    Where is Apple? Even from an investment perspective.

    • threetonesun 4 days ago

      My MacBook Pro runs local models better than anything else in the house and I have not yet needed to install a small nuclear reactor to run it, so, I feel like they're doing fine.

    • rubyfan 4 days ago

      Being rationale.

      • fancyfredbot 4 days ago

        Rational.

        • rubyfan 4 days ago

          Ha, that too.

        • newfocogi 4 days ago

          Maybe we're not sure if they're being rational or rationalizing.

    • bertili 4 days ago

      Apple is doing fine and often spends the same $100B in a year buying back Apple stock.

    • brcmthrowaway 4 days ago

      Losing the race

      • gpm 4 days ago

        Right, but is the race to the pot of gold, or the stoplight (in which case by "losing" they save on gas)?

      • richwater 4 days ago

        This is not something that can be won. The LLM architecture has been reaching its limitations slowly but surely. New foundational models are now being tweaked for user engagement rather than productive output.

  • FrankyHollywood 4 days ago

    $100 billion, what a number. It makes me a bit cynical. Think of the useful developments you could finance in clean energy, education, nature preservation, medicine, anything.

    But no, let's build us a slightly better code generator.

    Strange times we live in...

  • EcommerceFlow 4 days ago

    If solar can't compete with natural gas economically, and subsidizing solar ends up disincentivizing natural gas production by artificially lowering energy prices, what's the solution here?

    • henearkr 4 days ago

      Your question is weird.

      Solar does compete economically with methane already, and it's only going to improve even more.

      • EcommerceFlow 4 days ago

        If true, why aren't we mass scaling it all over the American West? We have railways running from West -> East, why not include power lines that can take power from energy farms in the West -> East? No major project in AZ, TX, or CA to give a city free power? etc

        • ux266478 4 days ago

          > We have railways running from West -> East, why not include power lines that can take power from energy farms in the West -> East?

          Firstly, there is no such thing as an infinitely scaling system.

          Secondly, because power transmission isn't moving freight. The infrastructure to move electricity long distances is extremely complicated. Even moving past basic challenges like transmission line resistance and voltage drop, power grids have to be synchronized in both phase and frequency. Phase instability is a real problem for transmission within hundreds of miles, let alone thousands upon thousands.

          Also that infrastructure is quite a bit more expensive to build than rail or even roads, and it's very maintenance hungry. An express-built piece of power transmission that goes direct from a desert solar farm to one of the coasts is just fragile centralization. You have a long chain of high-maintenance infrastructure; a single point of failure makes the whole thing useless. So instead you go through the national grid, and end up with nothing, because all of that power is getting sucked up by everyone between you and the solar farm. It probably doesn't even make it out of the state it's being generated in.

          BTW the vast majority of the cost of electricity is in the infrastructure, not its generation. Even a nuclear reactor is cheap compared to a large grid. New York city's collection of transmission lines, transformers, etc. (not even any energy generation infrastructure, just transmission) ballparks a couple hundred billion dollars. Maintenance is complex and extremely dangerous, which means the labor is $$$$. That's what you're paying for. That's why as we continue to move towards renewables price/watt will continue to go up, even though we're not paying for the expensive fuel anymore. The actual ~$60 million worth of fuel an average natural gas plant burns in a year pales in comparison to the billions a city spends making sure the electrons are happy.
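
          A toy calculation illustrates the line-loss part of this (a deliberately simplified DC model that ignores reactance and the phase/frequency issues above; the 1GW and 10-ohm numbers are assumed for illustration, not taken from any real line):

```python
# Simplified resistive loss: delivering power P at line voltage V through total
# resistance R draws current I = P / V and dissipates I^2 * R as heat.

def loss_fraction(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    current = power_w / voltage_v
    return (current ** 2) * resistance_ohm / power_w

# Illustrative: 1 GW through 10 ohms of total line resistance.
for volts in (230e3, 500e3, 765e3):
    print(f"{volts / 1e3:.0f} kV: {loss_fraction(1e9, volts, 10):.1%} lost")
```

          Doubling the voltage cuts resistive loss by 4x, which is why long-haul lines run at hundreds of kilovolts; the synchronization problems are a separate (and harder) constraint.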

        • gpm 4 days ago

          60% tariffs on solar components from China, an executive that is actively hostile to renewable energy, and you still are massively scaling it to some extent.

          67% of new grid capacity in the US was solar in 2024 (a further 18% was batteries, 9% wind, and 6% for everything else). In the first half of 2025 that dropped to 56% solar, 26% batteries, 10% wind, and 8% everything else (gas). Source for numbers: https://seia.org/research-resources/solar-market-insight-rep...

        • henearkr 4 days ago

          It is massively scaling everywhere, and notably in Texas btw.

        • philipkglass 4 days ago

          Getting approval across multiple states for lines takes a very long time. The federal government and just about any state, municipality, or private land owner along the proposed route can block or delay it. The TransWest Express transmission line project started planning in 2007 but couldn't start construction until 2023, and it only needed to cross 4 states.

          If the coast-to-coast railways hadn't been built in the past, I don't think the US could build them today. There are too many parties who can now block big projects altogether or force the project to spend another 18 months proving that it should be allowed to move forward.