AI won't use as much electricity as we are told (2024)

(johnquigginblog.substack.com)

63 points | by hirpslop a day ago ago

121 comments

  • jerf a day ago ago

    It's been a while, but I don't recall any of the dotcom startups making deals with nuclear energy companies to buy out entire nuclear power stations: https://www.npr.org/2024/09/20/nx-s1-5120581/three-mile-isla...

    And that's just an example, there are many power-related deals of similar magnitude.

    The companies building out capacity certainly believe that AI is going to use as much power as we are told. We are told this not on the basis of hypothetical speculation, but on the basis of billions of real dollars being spent on real power capacity for real data centers by real people who'd really rather keep the money in question. Previous hypotheses not backed by billions of dollars are not comparable predictions.

    • kyledrake a day ago ago

      > The companies building out capacity certainly believe that AI is going to use as much power as we are told.

      The same could be said of dark fiber laid during the dot-com boom, or unused railroads, etc. Spending during a boom is not indicative of properly recognized future demand for resources.

      • jerf a day ago ago

        It's not future demand, it's current demand. Microsoft has already said they're power blocked rather than chip blocked.

        Please note how I say current demand, and don't over-project as to what my opinion about future demand is. I think there's a small but reasonable chance that demand will sink for some reason or another in the next few years, and I think there's a pretty decent chance that in the next five years someone will come up with some way to make these things an order of magnitude (or more) more efficient, which would crash their electricity demands. But it's not a hypothetical "we need power in two years", or at least, not just that... it's "we need more power now".

        There's a big difference between "I may hypothetically need some more capacity later, I'd better go buy it now" and "I concretely need more capacity right now".

    • skybrian a day ago ago

      Yes, big bets tell us something but they are not a crystal ball. Some of the same companies hired lots of people post-pandemic and then reversed. People who control enormous amounts of money can make risky bets that turn out to be wrong.

      • tharmas a day ago ago

        There's a theory that Big Tech hired to hoard talent and inhibit the competitiveness of rival companies.

        • kjkjadksj a day ago ago

          That is not a theory and not limited to big tech either.

    • belter a day ago ago

      "Exposing The Dark Side of America's AI Data Center Explosion" - https://youtu.be/t-8TDOFqkQA

    • wheelerwj a day ago ago

      100% this.

  • palata a day ago ago

    > But we have been here before. Predictions of this kind have been made ever since the emergence of the Internet

    I don't think I live in the same world as the author. Ever since the emergence of the Internet, "stuff related to IT" has been using more and more energy.

    It's like saying "5G won't use as much electricity as we are told! In fact 5G is more efficient than 4G". Yep, except that 5G enables us to use a lot more of it, and therefore we use more electricity.

    It's called the rebound effect.

    • onlyrealcuzzo a day ago ago

      If you're using more of it because it's replacing corporate travel, going into the office, and driving across town to see your friends and family (FaceTiming instead), then you are still MASSIVELY reducing your total energy use.

      It's not like the majority of electricity use by computers is complete waste.

      You can pooh-pooh that and say "I don't want to live in the digital world, I want to spend more time flying around the world to work with people in person or actually see my mom, or buy physical paper in stores that it's shipped to and write physical words on it and have the USPS physically ship it" - but that's just wildly, almost unfathomably, less efficient.

      If Google didn't exist, who knows how many more books I'd need to own, how much time I'd spend buying those books, how much energy I'd spend going to the stores to pick them up, or having them shipped.

      It's almost certainly a lot less than how much energy I spend using Google.

      While we all like to think that Facebook is a complete waste of time, what would you be spending your time doing otherwise? Probably something that requires more energy than the near-nothing it takes to look at memes on your phone.

      Not to mention, presumably, at least some people are getting some value from even the most wasteful pits of the Internet.

      Not everything is Bitcoin.

      • palata a day ago ago

        You also seem to live in a different world. I urge you to start getting informed on what needs to be done in order to build hardware (hint: it does not grow on trees).

        > then you are still MASSIVELY reducing your total energy.

        Instead of using all those caps, look at the numbers: we have them. We use more and more energy.

        > but that's just wildly, almost unfathomably, less efficient.

        Not sure if you really need the hint, but you shouldn't spend more time flying around the world.

        > It's almost certainly a lot less than how much energy I spend using Google.

        It is a fact that it isn't. Before Google, people were using less energy than we are now, period.

        > Probably something that requires more energy than close to nothing looking at memes on your phone.

        The industry that gets those memes onto the hardware you call a phone is anything but "close to nothing" when it comes to energy. I would say that you are arguing in bad faith, but with all those examples you've given, it seems like you are just uninformed.

        So let me be blunt: your kids will most likely die because of how much energy we use (from one of the plethora of problems coming from that). At this point, we cannot do much about it, but the very least would be to be aware of it.

        • onlyrealcuzzo a day ago ago

          More energy because more people and more things are electrified & mechanical.

          A massive portion of the world was basically living in the stone age and has been lifted into middle class lives over the last 60 years.

          The population has also more than doubled.

          This is like comparing apples to apes.

          Sure, if you go back to when we were all monkeys, we are obviously using more energy per capita.

          If you go back to WW2, the West is using far less energy per capita, even when you account for imports. And again, that's far less energy to produce far better lives. And both of those trends are continuing every year.

          Sorry, you can't say that because we use more energy globally, every use of energy is causing us to use more energy. It's not that simple.

          • palata a day ago ago

            > Sure, if you go back to when we were all monkeys, we are obviously using more energy per capita.

            We keep using more and more energy per capita, period. You can go back 10 000 years, 200 years or 100 years, it's the same.

            > If you go back to WW2, The West is using far less energy per capita, even when you account for imports.

            This is blatantly wrong.

            > Sorry, you can't say, globally we use more energy, so every usage of energy is causing us to use more energy. It's not that simple.

            It is that simple. What you wrote is called a tautology: we use more energy, so we use more energy. And every new use of energy causes us to use more energy.

            If you use more, you use more. How is that not simple? :-)

            • onlyrealcuzzo a day ago ago
              • palata a day ago ago

                Ok, you show me a line. Where does it explain what it measures? It says "Energy used per capita", but it doesn't say what energy it accounts for.

                It most definitely does NOT account for the commute of the employees in China who worked on parts of your smartphone. Does it account for the use of TikTok? To what extent? Does it account for the AC in the datacenters used by TikTok outside the US?

      • wahnfrieden a day ago ago

        How do you account for overall energy use being up massively, and rising at a record-breaking pace?

        • timschmidt a day ago ago

          According to the following references, most residential energy is used for heating and cooling. Most commercial energy is used for lighting, heating, and cooling. And most industrial energy is used in chemical production, petroleum and coal products, and paper production.

          1: https://www1.eere.energy.gov/buildings/publications/pdfs/cor...

          2: https://www.eia.gov/energyexplained/use-of-energy/industry.p...

        • rtuulik a day ago ago

          It's not. For the US, energy use per capita has been trending downwards since 1979. For the developing world, the increase in energy usage is tied to increasing living standards.

          • palata a day ago ago

            > For the US, energy use per capita has been trending downwards since 1979

            It would be relevant if the US was completely isolated from the rest of the world. But guess what? The hardware you used to write this comment does not come from the US.

            Not taking into account the energy that went into building and transporting your hardware to where you are currently using it is... well, wrong.

        • onlyrealcuzzo a day ago ago

          > How do you account for overall energy use being up massively, and rising at record breaking pace

          That has nothing to do with how much energy is spent on Google and the Internet vs how many more people there are, and how much more stuff the average person in developing economies has.

    • Arnt a day ago ago

      Nothing forces the rebound effect to dominate. Computers grow cheaper, we rebound by buying ones with higher capacity, but the overall price still shrinks. I bet the computer you used to post today cost much less than Colossus.

      Similarly, nothing forces AI or 5G to use more power than whatever you would have done instead. You can stream films via 5G that you might not have done via 4G, but you might've streamed via WLAN or perhaps cat5 cable instead. The rebound effect doesn't force 5G to use more power than WLAN/GBE. Or more power than driving to a cinema, if you want to compare really widely. The film you stream makes it comparable, not?

      • everdrive a day ago ago

        >Nothing forces the rebound effect to dominate.

        Human nature does. We're like a gas, and we expand to fill the space we're in. If technology uses less power, in general we'll just use more of it until we hit whatever natural limits are present (usually cost, or availability). I'm not sure I'm a proponent of usage taxes, but they definitely have the right idea; people will just keep doing more things until it becomes too expensive or they are otherwise restricted. The problem you run into is how the public reacts when "they" are trying to force a bunch of limitations on you that you didn't previously need to live with. It's politically impossible, even in a case where it's the right choice.

        • Arnt a day ago ago

          I don't understand why "we're like a gas, and expand to fill the space we're in". What makes the simile apply to e.g. AI or 5G when it doesn't apply to others, e.g. computer prices?

          • everdrive a day ago ago

            I think the Apple ARM chips are a good example. They're fantastically more efficient and fantastically powerful. We _could_ take this incredible platform and say "we can probably do personal computing on 3-5 watts if we really focus on efficiency." But we don't do that. With more powerful chips, websites and apps and operating systems will get less efficient, bigger, more bloated. If there's any slack in the system we'll just take it up with crap. The chips will be faster next year so why bother making things more efficient? Repeat this process forever, and we eat up all of our efficiency gains.

            • Arnt a day ago ago

              My Apple ARM laptop has 24+ hours battery lifetime in practice, three times as much as the best laptop I had in the 2000-2015 period, and it's lighter than most of those laptops too. (Can't remember the battery lifetime of my 2015-2019 laptop.) Clearly not all efficiency gains have been eaten up.

              • everdrive a day ago ago

                Agreed, it's not a perfectly linear trend, but if we had the 2000-2015 computing requirements _and_ the Apple ARM efficiency, we could be somewhere very special.

            • vel0city a day ago ago

              A lot of households did make the change from having a few hundred watt PC with a hundred watt monitor to a couple of phones and maybe a tablet that don't use anywhere near as much energy. They use those devices all day, but overall they use less power than a few hours a week of an older desktop.

      • bilekas a day ago ago

        > Similarly, nothing forces AI or 5G to use more power than whatever you would have done instead

        Am I missing something, or has the need for vast GPU horsepower been solved? Those requirements were not in DCs before and they're only going up. Whatever way you look at it, there's got to be an increase in power consumption somewhere, no?

        • Arnt a day ago ago

          Not necessarily, no.

          You can pick and choose your comparisons, and make an increase appear or not.

          Take weather forecasts as an example. Weather forecasting uses massively powerful computers today. If you compare that forecasting with the lack of forecasts two hundred years ago there obviously is an increase in power usage (no electricity was used then) or there obviously isn't (today's result is something we didn't have then, so it would be an apples-to-nothing comparison).

          If you say "the GPUs are using power now that they weren't using before" you're implicitly doing the former kind of comparison. Which is obviously correct or obviously wrong ;)

        • timschmidt a day ago ago

          GPU compute in datacenters has been a thing for at least 20 years. Many of the top500 have included significant GPU clusters for that long. There's nothing computationally special about AI compared to other workloads, and in fact it seems to lend itself to multiplexing quite efficiently - it's possible to process thousands of prompts for a negligible memory bandwidth increase over a single prompt.

          AI is still very near the beginning of the optimization process. We're still using (relatively) general purpose processors to run it. Dedicated accelerators are beginning to appear. Many software optimizations will be found. FPGAs and ASICs will be designed and fabbed. Process nodes will continue to shrink. Moore will continue to exponentially decrease costs over time as with all other workloads.
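
          A toy illustration of that batching point - the layer size and batch count below are invented numbers (not any real model), just to show that the weights are read once per pass while only the activations scale with the batch:

            # Back-of-envelope: bytes moved per forward pass through one weight
            # matrix, batch size 1 vs batch size 1000. The weight read dominates
            # and happens once regardless of how many prompts are batched.
            d_in, d_out = 8192, 8192        # hypothetical layer dimensions
            bytes_per_param = 2             # fp16
            weight_bytes = d_in * d_out * bytes_per_param

            def traffic(batch):
                act_bytes = batch * (d_in + d_out) * bytes_per_param
                return weight_bytes + act_bytes

            for batch in (1, 1000):
                t = traffic(batch)
                print(f"batch={batch:5d}: {t / 1e6:7.1f} MB moved "
                      f"({t / traffic(1):.2f}x the batch-1 traffic)")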

          • philipwhiuk a day ago ago

            > Moore will continue to exponentially decrease costs over time as with all other workloads.

            There's absolutely no guarantee of this. The continuation of Moore's law is far from certain (NVIDIA think it's dead already).

            • timschmidt a day ago ago

              > NVIDIA think it's dead already

              Perhaps that's what Jensen says publicly, but Nvidia's next generation chip contains more transistors than the last. And the one after that will too.

              Let me know when they align their $Trillions behind smaller less complex designs, then I'll believe that they think Moore's law is out of juice.

              Until then, they can sit with the group of people who've been vocally wrong about moore's law's end for the last 50 years.

              Our chips are still overwhelmingly 2D in design, just a few dozen layers thick but billions of transistors wide. We have quite a ways to go based on a first principles analysis alone. And indeed, that's what chip engineers like Jim Keller say: https://www.youtube.com/watch?v=c01BlUDIlK4

              So ask yourself how it benefits Jensen to convince you otherwise.

              • adgjlsfhk1 a day ago ago

                Progress continues, but at a far slower rate than it used to. Nvidia has gained ~6x density in the past 9 years (1080 to 5090), while a doubling every 2 years would be >20x density in 9 years. The past 6 years (3090) are even worse, with only a 3x density gain.

                • timschmidt a day ago ago

                  Moore's law says nothing about density.

                  "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year."

                  Density is one way in which industry has met this observation over decades. New processes (NMOS, CMOS, etc.) are another. New packaging techniques (flip chip, BGA, etc.). New substrates. There's no limit to process innovation.

                  Nvidia's also optimizing their designs for things other than minimum component cost. I.e. higher clock speeds, lower temperatures, lower power consumption, etc. It may seem like I'm picking a nit here, but such compromises are fundamental to the cost efficiency Moore was referencing.

                  All data I've seen, once fully considered, indicates that Moore's law is healthy and thriving.

          • bilekas a day ago ago

            > GPU compute in datacenters has been a thing for at least 20 years. Many of the top500 have included significant GPU clusters for that long.

            Of course they've been a thing, but for specialised situations - maybe rendering farms or backroom mining centers - and it's disingenuous to claim that there's not exponential growth in GPU usage.

            • timschmidt a day ago ago

              Of course they've been a thing, but for specialized situations - maybe calculating trajectories or breaking codes - and it's disingenuous to claim that there's not exponential growth in digital computer usage.

              Jest aside, the use of digital computation has exploded exponentially, for sure. But alongside that explosion, fueled by it and fueling it reciprocally, the cost (in energy and dollars) of each computation has plummeted exponentially.

              • bilekas a day ago ago

                I really would like to see more of your data showing that; I think it would put this discussion to rest, actually, because I keep seeing articles that dispute it. At least older ones that ring bells, specifically https://epoch.ai/blog/trends-in-the-dollar-training-cost-of-...

                • timschmidt a day ago ago

                  You can find plenty of jumping off points for research here: https://en.wikipedia.org/wiki/Performance_per_watt

                  Along with this lovely graph captioned: "Exponential growth of supercomputer performance per watt based on data from the Green500 list." (note the log scale): https://en.wikipedia.org/wiki/Performance_per_watt#/media/Fi...

                  From the section about GPU performance per watt, I'll quote:

                  "With modern GPUs, energy usage is an important constraint on the maximum computational capabilities that can be achieved. GPU designs are usually highly scalable, allowing the manufacturer to put multiple chips on the same video card, or to use multiple video cards that work in parallel. Peak performance of any system is essentially limited by the amount of power it can draw and the amount of heat it can dissipate. Consequently, performance per watt of a GPU design translates directly into peak performance of a system that uses that design."

      • palata a day ago ago

        > Nothing forces the rebound effect to dominate.

        Not sure what to say to that. Yeah, it would be great if we didn't put so many resources into destroying our own world. I agree.

        The fact is that rebound effect very much dominates everything we do. I'm not saying it should, I'm saying it does. It's an observation.

        • Arnt a day ago ago

          I think you're telling me that graphs like these always point upwards, right?

          https://ourworldindata.org/grapher/per-capita-energy-use?tab...

          My own impression is that sometimes the aggregate total grows and sometimes it doesn't. And when it grows, sometimes that's because the rebound effect dominates.

          • palata a day ago ago

            Read the part in your link that says "Here, energy refers to primary energy".

            Primary energy in the US ignores the primary energy used in China for goods that end up being imported into the US.

      • Analemma_ a day ago ago

        There is some limit to the rebound effect because people only have so many hours in the day, but we’re nowhere near the ceiling of how much AI compute people could use.

        Note how many people pay for the $200/month plans from Anthropic, OAI etc. and still hit limits because they constantly spend $8000 worth of tokens letting the agents burn and churn. It’s pretty obvious that as compute gets cheaper via hardware improvements and power buildout, usage is going to climb exponentially as people go “eh, let the agent just run on autopilot, who cares if it takes 2MM tokens to do [simple task]”.

        I think for the foreseeable future we should consider the rebound effect in this sector to be in full force and not expect any decreases in power usage for a long time.

    • taeric a day ago ago

      Do we use more electricity because of 5G? I confess I'd assume modern phones and repeater networks use less power than older ones. Even at large.

      I can easily agree that phones that have internet capabilities use more, as a whole, than those that didn't. The infrastructure needs were very different. But, especially if you are comparing to 4G technology, much of that infrastructure already had to distribute content that was driving the extra use.

      I would think this would be like cars. If you had taken estimates of how much pollution vehicles emitted 40 years ago and assumed that would stay constant even as the number of cars went up, you'd probably conclude we are living with the worst air imaginable. Instead, even gas cars got far better as time went on.

      Doesn't mean the problem went away, of course. And some sources of pollution, like tires, did get worse in total as we scaled up. Hopefully we can find ways to make that better, as well.

      • palata a day ago ago

        > Do we use more electricity because of 5G? I confess I'd assume modern phones and repeater networks use less power than older ones. Even at large.

        If we did exactly the same with 5G as we did with 4G, it would be more efficient.

        But why do we develop 5G? Because we can do more. It is more efficient, but we do much more, so we increase our energy consumption. This is called the "rebound effect". It's observed for every. single. technology.

      • aceazzameen a day ago ago

        As a data point, I turn 5G off on my phone and get several hours more battery life using 4G. I'm pretty sure the higher bandwidth is consuming more energy, especially since 5G works at shorter distances and probably needs more power to stay connected to cell towers.

        • catlikesshrimp a day ago ago

          HSPA+ works for HN... but people want 8K TikTok.

      • ElevenLathe a day ago ago

        The phones, towers, and networks are only the tip of the power iceberg. How much electricity are we burning to run the servers to service the requests that all these 5G phones can now make because of all the wonderfully cheap wireless connectivity?

    • Majestic121 a day ago ago

      This is countered in the article.

      "Yet throughout this period, the actual share of electricity use accounted for by the IT sector has hovered between 1 and 2 per cent, accounting for less than 1 per cent of global greenhouse gas emissions."

      • prewett a day ago ago

        But presumably the total use of energy has been going up, so while the relative percentage might be the same, I doubt very much that the absolute quantity of GWh/year has stayed the same.

      • palata a day ago ago

        It's bad faith to talk about greenhouse gas emissions without taking into account the indirect contributions. When you use a computer, you cannot only account for the electricity that goes into the computer that is sitting on your desk.

        You have to account for all the energy that went into extracting the materials from the ground, building the electronics, shipping it across the world, and then the electricity to operate it.

    • bicepjai a day ago ago

      Sounds similar to Jevons Paradox

  • JohnFen a day ago ago

    > By contrast, the unglamorous and largely disregarded business of making cement accounts for around 7 per cent of global emissions.

    Oh, that's not a good example of the point they're trying to make. The emissions from concrete are a point of major concern and are frequently discussed. A ton of effort is being put into trying to reduce the problem, and there are widespread calls to reduce the use of the material as much as possible.

    • dsr_ a day ago ago

      The only useful point that they make is that predictions about unending growth are always wrong in detail. Every actual hockey stick turns into a sigmoid, then falls. Meanwhile, a new hockey stick comes along.

      • Mistletoe a day ago ago

        But AI training has been behaving like Bitcoin mining, which constantly increases the difficulty. AI companies so far have been having to release costlier and costlier models to keep up with the Joneses. We don’t want the final iteration to be a Dyson sphere around the sun or the black hole at the center of our galaxy so Gemini 10,000 Pro can tell us “Let there be light.” Or maybe we do, I don’t know.

        • timschmidt a day ago ago

          DeepSeek has shown that significantly less costly training is possible when incentivized. Even for SOTA models.

        • Kye a day ago ago

          The previous 9,999 Geminis promised they'd solved entropy and said the words with no real effect so people stopped listening to it. It's very lonely now.

    • nerdponx a day ago ago

      Also modern infrastructure is literally built on concrete. Whereas the broad benefits of AI are dubious by comparison.

    • PTOB a day ago ago

      Has he considered exactly how much concrete is needed to build a datacenter campus?

      • Diggsey a day ago ago

        Essentially zero as a fraction of global concrete usage...

      • quickthrowman a day ago ago

        A 6” concrete slab (excluding footings) has ~18,500 cubic yards of concrete per million sqft (54 sqft of 6” slab is one cubic yard of concrete)
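
        For anyone checking the arithmetic, a quick sketch (6 in = 0.5 ft, 27 cubic ft per cubic yard):

          # 6-inch slab: cubic yards of concrete per million square feet
          slab_thickness_ft = 0.5                          # 6 inches
          sqft_per_cubic_yard = 27 / slab_thickness_ft     # 54 sqft of slab per yard
          yards_per_million_sqft = 1_000_000 / sqft_per_cubic_yard
          print(f"{yards_per_million_sqft:,.0f} cubic yards per million sqft")  # ~18,519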

    • beepbooptheory a day ago ago

      In general there seems to be a big given in the argument that I don't think is obvious:

      > At the other end of the policy spectrum, advocates of “degrowth” don’t want to concede that the explosive growth of the information economy is sustainable, unlike the industrial economy of the 20th century.

      This seems to imply we all must agree that the industrial economy of the 20th century was sustainable, and that strikes me as an odd point of agreement to try to make. Isn't it just sidestepping the whole point?

  • pwarner a day ago ago

    Hopefully the panic continues and we get a lot of extra electricity, ideally via nuclear, wind, solar - and then if AI is a flop at least we get big progress on global warming.

    • blain a day ago ago

      I thought you were going to say cheaper energy, but global warming works too.

      Also, it's called climate change now.

    • thatguy0900 a day ago ago

      Notably this comes during a US administration with open, unironic hatred of all forms of clean energy for ideological reasons

    • wahnfrieden a day ago ago

      How does an urgent need for more energy use lead to overall cleaner energy? Won’t it also accelerate unclean energy use to saturation, even if additional clean sources are needed for capacity?

  • sollewitt a day ago ago

    “You may not know about the issue but I bet you reckon something, so why not tell us what you reckon. Let us enjoy the full majesty of your uninformed ad-hoc reckon” - David Mitchell.

    • cph123 a day ago ago

      "Let us enjoy the full majesty of your uninformed ad-hoc reckon, by going to bbc.co.uk… clicking on ‘what I reckon’ and then simply beating on the keyboard with your fists or head."

  • bobbyraduloff a day ago ago

    > But far from demanding more electricity personal computers have become more efficient with laptops mostly replacing large standalone boxes, and software improvements reducing waste.

    If only it were true. I reckon we're using multiple orders of magnitude more compute per $ of business objective, simply because of the crazy abstractions. For example, I know of multiple small HFT firms that are crypto market makers with their trading bots in Python. Many banks in my country have Excel macros on top of SQL extensions on top of COBOL. We've not reduced waste in software but rather quite the opposite.

    I don't think this is super relevant to the article's point, but I think it's an under-discussed topic.

    • kalleboo a day ago ago

      Excel has already added a =COPILOT() function. Imagine the waste of all those formulas that probably amount to some basic mathematical formula that could be run on a 386.

    • whiplash451 a day ago ago

      > We’ve not reduced waste in software but rather quite the opposite.

      Indeed. But that is because we optimized (and are still optimizing) for speed of development, not much else.

  • maerF0x0 a day ago ago

    AI helped me fix my own car, no new parts, no driving to the stealership, no comfy lobby to light, no extra building to heat, no IT system to book me into...

    It's my opinion AI, like many technologies since the 1950s, will lead to more dematerialization of the economy meaning it will net net save electricity and be "greener".

    This is an extension of what Steven Pinker says in Enlightenment Now.

  • Tycho a day ago ago

    What’s the energy profile of running inference in a typical ChatGPT prompt compared to:

      - doing a google search and loading a linked webpage
      - taking a photo with your smartphone and uploading it to social media for sharing
      - playing Fortnite for 20 minutes
      - hosting a Zoom conference with 15 people
      - sending an email to a hundred colleagues
    
    I'd be curious. AI inference is massively centralised, so of course the data centres will be using a lot of energy, but less centralised use cases may be less power efficient from a holistic perspective.

    • JimDabell a day ago ago

      A ChatGPT prompt uses 0.3 Wh, which is approximately how much energy a Google search took in 2009.

      AI energy use is negligible compared with other everyday activities. This is a great article on the subject:

      https://andymasley.substack.com/p/a-cheat-sheet-for-conversa...

      The same author has published a series of articles that go into a lot of depth when it comes to AI energy and water use:

      https://andymasley.substack.com/p/ai-and-the-environment
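
      For scale, a rough comparison using that 0.3 Wh per-prompt figure - the appliance wattages below are my own ballpark assumptions, not numbers from the linked articles:

        # How many 0.3 Wh prompts equal some everyday energy uses?
        prompt_wh = 0.3
        everyday_wh = {
            "1 hour of a ~60 W laptop": 60 * 1.0,
            "10 minutes of a ~1500 W kettle": 1500 * (10 / 60),
            "1 hour of a ~100 W TV": 100 * 1.0,
        }
        for label, wh in everyday_wh.items():
            print(f"{label}: {wh:6.0f} Wh ~ {wh / prompt_wh:5.0f} prompts")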

      • danans a day ago ago

        > A ChatGPT prompt uses 0.3 Wh, which is approximately how much energy a Google search took in 2009.

        That's the number their CEO put out, but AFAIK it is completely unverified (they did not provide any background as to how it was calculated). To believe it is an article of faith at this point.

        What is concrete and verifiable are the large deals being struck between AI model providers and energy providers - often to be supplied via fossil fuels.

        • JimDabell a day ago ago

          > That's the number their CEO put out, but AFAIK it is completely unverified (they did not provide any background as to how it was calculated). To believe it is an article of faith at this point.

          Google also puts the median Gemini prompt at 0.24 Wh. The information available from different sources points in the same direction; you don't have to take Sam Altman's word for it. 0.3 Wh was the figure that was already pretty dependable before he said that.

          > What is concrete and verifiable are the large deals being struck between AI model providers and energy providers - often to be supplied via fossil fuels.

          Which is completely irrelevant to this discussion unless you quantify that in Wh per prompt. Vague “deals are being struck!” hand-wringing doesn’t add to the discussion at all. Why are you demanding absolute, unimpeachable rigour when vendors give specific figures, but are comfortable with hand waving when it comes to complaining about energy use?

          • danans a day ago ago

            > Why are you demanding absolute, unimpeachable rigour when vendors give specific figures, but are comfortable with hand waving when it comes to complaining about energy use?

            Because the evidence that AI data centers are using a lot of energy (generated by fossil fuels in particular) is observable in current concrete reality, like the xAI datacenters running gas turbines adjacent to a neighborhood in Memphis, TN:

            https://gasoutlook.com/analysis/xai-data-centre-emits-plumes...

            Companies like OpenAI, Google, and xAI have an incentive to downplay the energy usage of their facilities. If not, they should publish their methodology. The burden is on them to prove that they won't drive up energy prices and increase emissions.

            • JimDabell a day ago ago

              > AI data centers are using a lot of energy

              This is a worthless thing to say in the context of this discussion.

              How much is “a lot”? How many queries does that service?

              Saying that “data centres use a lot of energy” doesn’t tell us anything at all about their energy efficiency.

              > Companies like OpenAI, Google, and xAI have an incentive to downplay the energy usage of their facilities. If not, they should publish their methodology.

              All you’re really doing here is giving me the impression you aren’t participating in this discussion honestly. You demand more detail, but if you had read the things I pointed you towards, you would see that Google have published what you want already. You don’t actually care about the details, you want to complain without bothering to look at the information handed to you on a plate.

    • slfnflctd a day ago ago

      These are the kinds of questions we need pursued to develop better insight into the overall societal impact of current and near-future LLMs. Energy usage is a critical measure of any technology. The tradeoffs between alternate use cases should be modeled as accurately as possible, including all significant externalities.

  • afavour a day ago ago

    That's an awful lot of certainty for something that isn't backed by very much evidence at all. Just "previous claims about inefficiency in tech have ended up being incorrect".

    As a counterpoint: look at crypto. The amount of power used by cryptocurrency has _not_ gone down, in fact it's increased.

    • patapong a day ago ago

      While I don't disagree with your overall point, I don't think crypto is a good counterpoint here. Crypto is conditioned on using more and more energy to secure the network. As the value increases, more mining hardware can be thrown at it, which increases security and thus value - there is no upper bound.

      AI, on the other hand, aims at both increased quality and reduced energy consumption. While there are certainly developments that favour the former at the cost of the latter (e.g. reasoning models), there are also indications that companies are finding ways to make the models more efficient while maintaining quality. For example, the moves from GPT-4 -> GPT-4-turbo and 4o -> 5 were speculated to be in the service of efficiency. Hopefully the market forces that make computing cheaper and more energy efficient will also push AI to become more energy efficient over time.

  • runako a day ago ago

    OpenAI yesterday announced[1] a partnership to deploy computer chips, but chose to denominate the size of the deal in gigawatts (instead of dollars, or some measure of computing capacity, or some measure of capability). They certainly seem to think about this in terms of electricity requirements, and seem to think they require a lot of it.

    (I may have the units off a bit, but it looks like OpenAI's recent announcement would consume a bit more than the total residential electricity usage of Seattle.)

    1 - https://openai.com/index/openai-nvidia-systems-partnership/
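
    As a rough unit check (the announcement is for roughly 10 GW of systems, as I read it; treating that as continuous draw gives an upper bound):

      # Converting a power figure (GW) into annual energy (TWh), assuming the
      # hardware runs flat out all year - real utilization would be lower.
      gw = 10                      # headline figure from the announcement (my reading)
      hours_per_year = 24 * 365
      twh_per_year = gw * hours_per_year / 1000
      print(f"{gw} GW running continuously ~ {twh_per_year:.0f} TWh per year")  # ~88 TWh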

  • skybrian a day ago ago

    There are contrary trends: LLMs are getting lots of efficiency improvements, but they're being used more.

    Which is more important? Understanding what happened so far is impossible without data, and those trends can change. It depends on what new technologies people invent, and there are lots of smart researchers out there.

    Armchair reasoning isn't going to tell us which trend is more important in the long term. We can imagine scenarios, but we shouldn't be very confident about such predictions, and we should distrust other people's confidence.

  • stevenjgarner a day ago ago

    > Most of the increase could be fully offset if the world put an end to the incredible waste of electricity on cryptocurrency mining (currently 0.5 to 1 per cent of total world electricity consumption, and not normally counted in estimates of IT use).

    I do not accept this. It was once true under Proof-of-Work (typically ~1,000–2,000 kWh per transaction), not so much under Proof-of-Stake (typically 0.03–0.05 kWh per transaction).

    Note that proof-of-stake may actually have a lower energy footprint than credit card or fiat banking transactions. An IMF analysis [1] pegged core processing for credit card companies at ~0.04 kWh per transaction (based on data centers and settlement systems), but noted that including user payment means like physical cards and terminals could increase this by about two orders of magnitude—though even then, it doesn't extend to bank branches or employee overhead - an overhead not implicit in decentralized finance.

    [1] https://www.elibrary.imf.org/view/journals/063/2022/006/arti...
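
    Putting the per-transaction figures quoted above side by side (all of them rough, model-dependent estimates):

      # Per-transaction energy, using the midpoints of the ranges quoted above
      kwh_per_txn = {
          "proof-of-work (typical estimate)": 1500.0,   # midpoint of ~1,000-2,000 kWh
          "proof-of-stake": 0.04,                       # midpoint of ~0.03-0.05 kWh
          "credit card core processing (IMF)": 0.04,
      }
      baseline = kwh_per_txn["proof-of-stake"]
      for label, kwh in kwh_per_txn.items():
          print(f"{label}: {kwh:8.2f} kWh ({kwh / baseline:,.0f}x proof-of-stake)")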

  • grmnygrmny2 a day ago ago

    The electricity aspect has always been one I wish AI skeptics/critics wouldn't lean on so much, because it seems like the one most likely to be solvable. I think we should be focusing on the economic aspects instead - if all this pans out as boosters promise, how does our economy exist when all white-collar jobs go away? Who buys all the stuff, with what money?

    • bluefirebrand a day ago ago

      I'm less concerned about the economy and more concerned about social stability.

      When people can't buy the stuff because they have no money, how do we stop the bloodbath that follows?

  • newscombinatorY a day ago ago

    Similar concerns were raised regarding the energy used to mine cryptocurrencies. It's basically wasted energy - no doubt about that. But this is different. Crypto's potential has been very limited all along, whereas generative AI has numerous potential uses, and we are seeing more and more companies, as well as ordinary people, utilising it.

    • thatguy0900 a day ago ago

      Great potential uses, like the upheaval of the jobs market during a time of political crisis, lol. Great time to destroy the energy market for the common citizen as well.

  • js8 a day ago ago

    I found this video https://youtu.be/IQvREfKsVXM interesting, especially because it mentions a couple of AI studies/papers that argue in favor of much smaller (and more efficient) models. (And I have never heard of them.)

    I suspect that yes, for AGI much smaller models will eventually prove to be sufficient. I think in 20 years everyone will have an AI agent in their phone, busily exchanging helpful information with other AI agents of people who you trust.

    I think the biggest problem with tech companies is they effectively enclosed and privatized the social graph. I think it should be public, i.e. one shouldn't have to go through a 3rd party to make an inquiry for how much someone trusts a given source of information, or where the given piece of information originated. (There is more to be written about that topic but it's only marginally related to AI.)

  • danans a day ago ago

    > Looking the other side of the market, OpenAI, the maker of ChatGPT, is bringing in around $3 billion a year in sales revenue, and has spent around $7 billion developing its model. Even if every penny of that was spent on electricity, the effect would be little more than a blip.

    The electricity spend on AI datacenters won't be uniformly distributed. It will probably concentrate in areas that currently have cheaper (and dirtier) electricity, like what xAI is doing in Tennessee.

    That will likely drive up local energy prices in those places, which will be further exacerbated by the US's disinvestment in renewable energy and resulting increased reliance on high cost fossil fuels.

  • bob1029 a day ago ago

    I'd be very interested in seeing some kind of aggregated daily demand curve for AI workloads.

    It seems like a lot of the hyperbolic angles are looking at this as a constant draw of power over time. There is no reason for a GPU inference farm to be ramped up to 100% clock speed when all of its users are in bed. The 5700XT in my computer is probably pulling a mere 8~12W right now since it is just sitting on an idle desktop. A hyperscaler could easily power down entire racks based upon anticipated demand and turn that into 0W.
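
    A toy model of that idea - the utilization curve and per-rack wattages below are invented purely to show the shape of the saving, not real datacenter figures:

      # Hypothetical diurnal demand: daily energy for an inference fleet that
      # powers unused racks down vs. one that leaves them idling.
      racks = 100
      kw_active, kw_idle = 40.0, 8.0      # per-rack draw (assumed)

      # Fraction of racks needed each hour (assumed day/night curve, 24 values)
      utilization = [0.2] * 7 + [0.5] * 3 + [0.9] * 8 + [0.6] * 4 + [0.3] * 2

      def daily_kwh(unused_rack_kw):
          total = 0.0
          for u in utilization:
              active = round(racks * u)
              total += active * kw_active + (racks - active) * unused_rack_kw
          return total

      print(f"idling unused racks:        {daily_kwh(kw_idle):>9,.0f} kWh/day")
      print(f"powering down unused racks: {daily_kwh(0.0):>9,.0f} kWh/day")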

    • pixl97 a day ago ago

      With the current demand for GPU time it makes no sense to have your units idle. If you're not selling inference to customers, you'll batch out training to whoever will rent the excess time from you.

      Maybe in the future there will be idle time.

      • datadrivenangel a day ago ago

        The point is that training activity has to be worth more than the marginal cost of running the hardware; if not, it's better to keep the hardware idle.

  • abosley a day ago ago

    As someone who has worked in hypercloud data center expansion and planning, I can confidently say this analysis is, um, incomplete. Without violating NDAs... I'd refer everyone to multiple public articles talking about the situation in Ireland, where the hyperclouds are consuming well north of 22% of total energy production.

  • xbmcuser a day ago ago

    Personally I don't get some of the power investments by US tech companies. Last year the UAE signed a project agreement to get 1 GW of power around the clock from a solar-and-battery project for around $6 billion. Prices for batteries are down a lot further now. So even after adding more battery and solar capacity for reliability, they could build out solar-and-battery plants where the cost of electricity comes in under $0.10/kWh.
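
    A rough sanity check on that price, using only the numbers above plus an assumed plant lifetime (capital only - no financing, O&M, or degradation - so treat it as a floor):

      # Capital-only cost per kWh for 1 GW delivered around the clock at ~$6B
      capex_usd = 6e9
      gw_continuous = 1.0
      lifetime_years = 25                  # assumption
      kwh_delivered = gw_continuous * 1e6 * 24 * 365 * lifetime_years
      print(f"capital cost alone ~ ${capex_usd / kwh_delivered:.3f}/kWh")  # ~$0.027/kWh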

  • altcognito a day ago ago

    Many "new" expenditures replace existing stuff. The initial versions are often the worst iterations we'll see, so even though the capability is going up, the energy usage will go down over time. It isn't universal (as we've seen a lot of new, true growth), but it is common.

  • wrs a day ago ago

    I skimmed down as far as the Y2K bug being “obviously false” and closed the tab.

  • timeon a day ago ago

    Sorry for the off-topic question: is Substack competing with Medium on the number of pop-ups?

  • NoPicklez a day ago ago

    Not the greatest article. Computing power consumption has absolutely increased, and muddying it with greenhouse emissions misses the point. It's still an increase in electricity use; the way in which we supply it doesn't change that.

    The average desktop computer uses much more power than mentioned; you only need to look at premium desktop components to see how much extra power those components require compared to previous years.

    High-end graphics cards decades ago required only around 155 W; nowadays the average GPU is pulling 300 W, and upwards of 400-500 W at the high end. Data centers command significantly more power than they used to, and that will only increase with AI usage.

    The International Energy Agency shows that global data center electricity usage has almost doubled since 2020 and is projected to double again.

  • cratermoon a day ago ago

    This article was written over a year ago. How have the author's assessments worked out?

  • dheera a day ago ago

    Even if AI doesn't use more electricity, electric cars and clean energy flight will need it.

    • SketchySeaBeast a day ago ago

      That's my current, probably misguided, hope. They couldn't justify getting the grid ready for electric vehicles, but frame it as a way to make a bunch of money and everyone's going to jump on board.

      Of course, the fact that xAI is throwing up gas turbines at their data centres seems to indicate that clean energy isn't a given.

  • josefritzishere a day ago ago

    This claim is based on the idea that the use of AI will plateau. I hope that is true. The alternatives are ominous.

  • newsclues a day ago ago

    For humanity to continue increasing the quality of life for more people, more energy is required.

    • bilekas a day ago ago

      > For humanity to continue increasing the quality of life for more people, more energy is required

      I'm not 100% sure that's strictly true. We naturally assume for the moment that more energy = more quality.

      It's like the Kardashev scale, which basically says you can't advance without consuming more and more energy. Is this a proven thing? Does the line need to always go up indefinitely?

    • SketchySeaBeast a day ago ago

      At the risk of being called a luddite or a carriage driver, does the current iteration of AI actually increase quality of life that much?

  • ChrisArchitect a day ago ago

    (2024)

    Isn't this space a bit too fast-moving to be submitting year-old posts on it?

    Plenty of grid-draining articles since:

    Electricity prices are climbing more than twice as fast as inflation

    https://news.ycombinator.com/item?id=44931763

    Big Tech's A.I. Data Centers Are Driving Up Electricity Bills for Everyone

    https://news.ycombinator.com/item?id=44905595

    The U.S. grid is so weak, the AI race may be over

    https://news.ycombinator.com/item?id=44910562

    And nuclear ambitions:

    Microsoft doubles down on small modular reactors and fusion energy

    https://news.ycombinator.com/item?id=45172609

    Google to back three new nuclear projects

    https://news.ycombinator.com/item?id=43925982

    Google commits to buying power generated by nuclear-energy startup Kairos Power

    https://news.ycombinator.com/item?id=41840769

    Three Mile Island nuclear plant restart in Microsoft AI power deal

    https://news.ycombinator.com/item?id=41601443

    Amazon buys stake in nuclear energy developer in push to power data centres

    https://news.ycombinator.com/item?id=41858863

  • catlikesshrimp a day ago ago

    Datacenters didn't need water cooling before the AI explosion. (Air cooling was still possible.)

    At first, DW's estimate was that one drop of potable water was consumed for each query (normal queries, not more expensive ones).

    Then Google - I don't know who allowed the sincerity, God bless them - released a first-hand analysis of their water consumption, and it is higher than the one-drop estimate: about 5 drops (the report puts the median prompt at roughly 0.26 mL of water).

    https://services.google.com/fh/files/misc/measuring_the_envi...

  • more_corn a day ago ago

    I don’t believe it

  • vikramkr a day ago ago

    And what about the predictions of energy use that did pan out, like air conditioning and stuff? Also, in 1999 how many personal computer companies were restarting nuclear power plants to fuel their projected energy consumption? Feels like a weird argument to make when the investments into AI infra are literally measured in gigawatts. Feels like a weird argument in general - AI consuming lots of energy isn't some weird degrowth conspiracy theory.

    • Mistletoe a day ago ago

      Let's not forget Sam Altman tried to raise $7 trillion for it somehow as well.
