Computers are basically resistive heaters with extra steps. The watts going into them have nowhere to go except out as heat. Conservation of energy and all that.
I have not done the math, but if the economic value of the compute is high enough, it's probably cheaper to heat a home with compute than with more efficient heat-pump technology. I think this must have been the case in the home bitcoin mining days.
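A rough break-even sketch, with made-up but hopefully plausible numbers (the electricity price, the heat-pump COP, and the per-kWh compute revenue are all assumptions here, not data):

    # Break-even: heating with compute vs. a heat pump.
    # All figures below are illustrative assumptions.
    elec_price = 0.30   # $/kWh, assumed retail electricity price
    cop = 3.0           # assumed heat-pump coefficient of performance

    heat_pump_cost = elec_price / cop   # $/kWh of heat: pump moves ~3 kWh of heat per kWh in
    compute_cost = elec_price           # $/kWh of heat: resistive, 1 kWh in = 1 kWh out

    # Compute heating wins once the work it does earns back the difference:
    breakeven = compute_cost - heat_pump_cost
    print(f"Compute heat is cheaper if it earns > ${breakeven:.2f} per kWh consumed")
    # -> $0.20 here, i.e. the compute only has to recoup (1 - 1/COP) of the power bill.

So against a COP-3 heat pump, the compute only needs to pay back about two thirds of its own electricity cost, which mining plausibly did in those days.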
Alternatively, if you are going to need the compute regardless (e.g. high-end gaming), and you need to heat your home anyway, then you might as well use the joules your PC is pumping out into your room.
The problem is that heating is cyclical: we don't always need to heat our homes, yet compute wants to run at 100%, 24/7, to extract as much value as possible from the upfront hardware cost. Other kinds of heating you can simply switch off when you don't need them.
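The duty-cycle penalty in rough numbers (the hardware price, lifetime, and heating-season length are all assumptions here):

    # Amortized hardware cost per compute-hour vs. utilization.
    # Illustrative assumptions only.
    hardware_cost = 2000.0    # $, assumed price of a GPU/server
    lifetime_years = 4
    hours_per_year = 8760

    for utilization in (1.0, 0.33):   # 24/7 vs. roughly a four-month heating season
        hours = lifetime_years * hours_per_year * utilization
        print(f"{utilization:4.0%} utilization: ${hardware_cost / hours:.3f}/compute-hour")
    # A heating-season-only rack pays roughly 3x more per useful compute-hour.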
So the solution is probably not a rack in every home - you'd need a way to get rid of unneeded heat, which adds cost and complexity to every installation. Better to have a DC near a neighbourhood, with pumped district heating coming from it. If the homes don't need the heat, the DC can dump it in a conventional way.
Yep. These various “tiers” have already been considered and are described in the Wikipedia article.
During the Ethereum GPU mining craze I used ~10 GPUs as the primary heat source for the whole house during the winter. It was awesome. For the first time I didn't have to care about using only 'cheap' electricity, since it wasn't being 'wasted' on heating alone but was doing 'useful' work (mining). The only downside was the noise.
I once lived in a damp limestone house where a gust of wind would scatter papers off the kitchen table with all the doors and windows shut.
One winter night the temp dropped below 5°C for the third or fourth time in a century. Unprepared, I considered the usual options: head in the oven or write code. I chose code. I built a blanket fort, tinkered with Electron a bit, considered the oven option again, and finally got my space heater going. Worked surprisingly well.
FTFA: "Individuals had already begun using computers as a heat source by 2011."
I used 3 to heat my office back in 1987. I worked in the old section of the building, and three bosses competed to win me over to "their" computer systems. To keep myself warm, I kept all three turned on. And a cup of coffee between my legs.
In East European communist countries, steam was piped from the industrial zones into the cities: industrial waste heat heated the apartment blocks. It was terrible, because those buildings had no alternative heating and the heat delivered depended on industrial activity, not the weather.
But today, waste heat from data centers could be a great base heating source for nearby cities, with local building heating only as a supplement/backup.