theamk
12 hours ago
"leaking" is the wrong word here - it implies some sort of inefficiency, process which is not working as well as it needs to. Leaky bucket, leaky faucet...
That's not the case here, that center is __dumping__ heat into environment - it is by design, all that electricity is being converted into the heat. By design, it's enormous electric heater.
dangalf
11 hours ago
Technically it is an inefficiency. The electricity should be doing computer things; heat is wasted electricity. It's just that there's not much the data centre could do about it.
stouset
10 hours ago
Even if the computer does perfectly-efficient computer things with every Joule, every single one of those Joules ends up as one Joule of waste heat.
If you pull 100W of power out of an electric socket, you are heating your environment at 100W of power completely independent of what you use that electricity for.
spyder
5 hours ago
Only true for our current computers and not true with reversible computing. With reversible computing you can use electricity to perform a calculation and then "push" that electricity back into a battery or a capacitor instead of dumping it to the environment. It's still a huge challenge, but there is a recent promising attempt:
"British reversible computing startup Vaire has demonstrated an adiabatic reversible computing system with net energy recovery"
https://www.eetimes.com/vaire-demos-energy-recovery-with-rev...
Short introduction video to reversible computing:
flave
5 hours ago
Actually pretty cool - I was about to comment “nice perpetual motion machine” but looked into it a bit more and it’s much more interesting than that (well, a real perpetual motion machine would be interesting but…)
Thanks for posting. Pretty cool.
anthonj
2 hours ago
This violates energy conservation principles. Some power will be "wasted" as heat, and some will be used for other work.
tasuki
5 hours ago
> every single one of those Joules ends up as one Joule of waste heat.
Yes, it ends up as heat, but with some forethought it could be used to e.g. heat people's homes rather than being wasted.
agumonkey
5 hours ago
These days it's not rare to have data-center-heated buildings. I guess crypto bros are just not thinking about this. But technically it could be done there too.
KellyCriterion
20 minutes ago
There was a startup in the EU which explicitly sold heat from crypto mining to the local energy provider. IIRC it was also on Hacker News some time ago.
agumonkey
8 minutes ago
Qarnot maybe
TheSpiceIsLife
5 hours ago
You can say that about any waste heat.
In reality, it’s not convenient to move all waste heat to where it’s more needed.
m4rtink
an hour ago
Modern industrial scale insulated hot water district heating systems can do dozens of kilometers with the water cooling down only by a degree Celsius.
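A rough sanity check of that claim (a sketch with assumed, purely illustrative numbers: a 20 km insulated trunk line losing ~25 W per metre and carrying ~500 kg/s of water):

    # Back-of-the-envelope temperature drop along an insulated district heating trunk line.
    # All parameters below are assumptions for illustration, not measured values.
    pipe_length_m = 20_000      # 20 km run
    loss_per_metre_w = 25       # heat loss of a well-insulated large pipe, W per metre
    mass_flow_kg_s = 500        # water mass flow through the trunk line
    cp_water = 4186             # specific heat of water, J/(kg*K)

    total_loss_w = pipe_length_m * loss_per_metre_w
    temp_drop_k = total_loss_w / (mass_flow_kg_s * cp_water)

    print(f"Total heat loss:  {total_loss_w / 1e3:.0f} kW")   # ~500 kW
    print(f"Temperature drop: {temp_drop_k:.2f} K")           # ~0.24 K over 20 km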
tremon
3 hours ago
It's always more convenient to ignore externalities. That doesn't mean we should be okay with only bottom-of-the-barrel solutions.
thegrim000
8 hours ago
I read it as the inefficient part isn't the compute efficiency, the inefficient part is dumping all the resulting heat into the environment without capturing it and using it in some way to generate electricity or do work.
On a related/side note, when there's talk about SETI and Dyson spheres, and detecting them via infrared waste heat, I also don't understand that. Such an alien civilization is seemingly capable of building massive space structures/projects, but then lets the waste heat just pour out into the universe in such insane quantities that we could see it tens or hundreds of light years away? What a waste. Why wouldn't they recover that heat and make use of it instead? And repeat the recovery until the final waste output is too small to bother recovering, at which point we would no longer be able to detect it.
oasisaimlessly
7 hours ago
All energy inevitably changes into heat eventually, and in the steady state, power in = power out.
There is no way to get rid of heat. It has to go somewhere; otherwise, the temperature of the system will increase without bound.
robkop
10 hours ago
Interesting question - how much will end up as sound, or in the ever smaller tail of things like storing a bit in flash memory?
Workaccount2
9 hours ago
Heat is the graveyard of energy. Everything that uses energy, or is energy, is actually just energy on its way to the graveyard.
The energy of the universe is a pool of water atop a cliff. Water running off this cliff is used to do stuff (work), and the pool at the bottom is heat.
The "heat death of the universe" refers to this waterfall running dry, with all the energy ending up in this useless pool of "heat".
devsda
6 hours ago
Do thermophotovoltaic cells operate on a different kind of heat?
Is it impossible to convert heat into other forms of energy without "consuming" materials, like in the case of steam, geothermal, or even the ones that need a cold body to utilize the thermoelectric effect?
LiamPowell
6 hours ago
TPVs don't rely solely on the temperature of an object being high; they instead rely on the two objects on either side having different temperatures. As heat moves[1] from one side to the other, some of the energy from that movement is turned into electricity.
[1]: Technically the movement itself is heat; the objects don't contain heat, rather they contain internal energy, but the two get mixed up more often than not.
supermatt
4 hours ago
That movement is effectively “consuming” the differential.
ajuc
4 hours ago
What thermal energy sources actually exploit is temperature difference, not heat. And in the end that difference averages out.
phil21
9 hours ago
Almost none. A long time ago a friend and I did the math for sound, photons (status LEDs), etc., and it was a rounding error - 1% or something silly like that.
And that’s ignoring that sound and photon emissions typically hit a wall or other physical surface and get converted back to heat.
It all ends up as heat in the end; it just depends on where that heat is dumped and whether you need to cool it or not. Most watts end up costing even more than the theoretical one watt of heat per watt, due to said cooling needs.
There is literally no way around the fact that every watt you burn for compute ends up as a watt of waste heat. The only factor you can control is how many units of compute you can achieve with that same watt.
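For a sense of scale, a sketch with assumed order-of-magnitude numbers (a ~500 W server, ~50 mW of acoustic power from the fans, a few status LEDs at a few milliwatts each):

    # What fraction of a server's electrical draw leaves as sound and light
    # rather than heat? All figures are assumed, order-of-magnitude only.
    server_power_w = 500            # total electrical draw
    acoustic_power_w = 0.05         # ~50 mW of sound power is already a loud machine
    led_power_w = 5 * 0.003         # five status LEDs emitting ~3 mW of light each

    non_heat_fraction = (acoustic_power_w + led_power_w) / server_power_w
    print(f"Non-heat fraction: {non_heat_fraction:.4%}")   # ~0.01%, a rounding error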
Terr_
7 hours ago
Well, at least until somebody devises a system that transports or projects it so that the heat ends up somewhere not-Earth. It'd still be heating the universe in general, of course, even in the form of sprays of neutrinos.
That reminds me of a sci-fi book, Sundiver by David Brin, where a ship is exploring the sun by firing a "refrigerator laser" to somehow pump-away excess heat and balance on the thrust.
mrDmrTmrJ
9 hours ago
All sound will end up as heat.
csomar
2 hours ago
Theoretically, if your computation is energy efficient, you won't need any electricity at all since the real computation costs zero energy.
usrnm
5 hours ago
If I turn my fan on and 100% of the electricity is converted to heat, where does the kinetic energy of moving fan blades come from? Even the Trump administration cannot just repeal the law of conservation of energy.
jo909
5 hours ago
While spinning, the blades store a minuscule amount of kinetic energy.
After removing power, even that small amount ends up as heat through friction (both in the bearings and, mostly, in air turbulence). And the blades end up in the same zero-energy state: sitting still.
So it is correct that 100% "ends up" as heat.
usrnm
4 hours ago
Most of that energy gets transferred to the air that's being moved by the blades, and who knows what that air does eventually. And we're not even talking about the plant-growing light that might be sitting in my room near my house plants, literally creating new life from electricity.
numb7rs
5 hours ago
Even if most of the energy goes into kinetic energy of the air, that air will lose momentum via turbulence and friction with the surrounding air, which will end up as... heat.
mr_toad
11 hours ago
There’s a minimum amount of energy (and thus heat) that computation has to dissipate, just because of physics. However, modern computers generate billions of times more heat than this minimum.
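That minimum is Landauer's limit, about kT·ln 2 of heat per irreversible bit operation. A quick sketch comparing it against an assumed ~10 pJ dissipated per bit-level operation in real hardware (order of magnitude only, once data movement is included):

    import math

    k_B = 1.380649e-23                   # Boltzmann constant, J/K
    T = 300                              # room temperature, K

    landauer_j = k_B * T * math.log(2)   # minimum heat per irreversible bit op, ~2.9e-21 J
    real_hw_j = 10e-12                   # assumed dissipation per bit-level op in real hardware

    print(f"Landauer limit at 300 K: {landauer_j:.2e} J per bit")
    print(f"Real hardware vs limit:  {real_hw_j / landauer_j:.1e}x")   # ~3e9, i.e. billions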
RhysU
11 hours ago
It'd be super fun to take that as an axiom of physics and then see how far upwards one could build from it. Above my skills by far.
ruined
9 hours ago
it's called the first law of thermodynamics
UltraSane
10 hours ago
The minimum amount of energy needed to compute decreases asymptotically to 0 as the temperature of space goes to 0. This is the reason for a common sci-fi trope where advanced civilizations hibernate for extremely long times so that they can do more computation with the available energy.
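Since the bound is kT·ln 2, it scales linearly with temperature; a small sketch comparing room temperature, liquid nitrogen, and the ~2.7 K cosmic microwave background (the floor you would actually get in deep space):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def landauer_limit(temperature_k: float) -> float:
        """Minimum heat per irreversible bit erasure at the given temperature."""
        return k_B * temperature_k * math.log(2)

    for t in (300, 77, 2.7):    # room temperature, liquid nitrogen, CMB
        print(f"{t:>6} K -> {landauer_limit(t):.2e} J per bit")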
ctmnt
9 hours ago
That’s a common trope? Can’t say I’ve run into it. But I’d like to! What are some good examples?
Supermancho
7 hours ago
In the book Calculating God, a character notes that this is a common civilization-wide choice. Living in virtual reality, rather than trying to expand into the vast expanses of space, is a common trope as much as it's a logical choice. It neatly explains the Fermi Paradox. In some fiction, like The Matrix, the choice might be forced due to cultural shifts, but the outcome is the same. A relatively sterile low-energy state civilization doing pure processing.
ithkuil
4 hours ago
I wonder if it's illogical to think that all civilizations must always pick the most logical of the options
triMichael
7 hours ago
Kurzgesagt just made a video on it a couple months back: https://www.youtube.com/watch?v=VMm-U2pHrXE
CamperBob2
8 hours ago
Here you go: https://pastebin.com/raw/SUd5sLRC
And it only cost 0.006 rain forests!
geoffschmidt
11 hours ago
Heat is not by itself waste. It's what electricity turns into after it's done doing computer things. Efficiency is a separate question - how many computer things you got done per unit electricity turned into heat.
anon84873628
6 hours ago
How many computer things you got done per unit electricity, and how many mechanical things you do with the temperature gradient between the computer and its heat sink.
For example, kinda wasteful to cook eggs with new electrons when you could use the computer heat to help you denature those proteins. Or just put the heat in human living spaces.
(Putting aside how practical that actually is... Which it isn't)
eimrine
6 hours ago
Good luck with collecting that heat from air.
anon84873628
6 hours ago
I think what they mean is that there is no Carnot engine hooked up between the heat source and the sink - which is, theoretically, something the data center could do about it.
charcircuit
5 hours ago
The electricity is doing computer things, building bitcoin blocks.
YetAnotherNick
5 hours ago
No it's not. It would be waste only if there were a high temperature gradient, which is minimized in mining operations through proper cooling.
It's just that computation requires electricity, and almost all of the heat in bitcoin mining comes from computation - technically, from changing transistor states.
sixtyj
5 hours ago
They could make a second floor with eggs and newborn chickens. /s
Lerc
4 hours ago
Well, if it's using solar power, it's just moving heat from one place to another.
I guess if it's using fossil fuel to generate power, it's also just moving heat from one place to another, but really really slowly. The relevant factor there is that the long-term storage was performing an important secondary function of holding a lot of CO2.
It's in Texas, surely that's an area amenable to solar production. What are they actually using there?
fainpul
4 hours ago
> By design, it's enormous electric heater.
You're right, it's not leaking, it's dumping excess heat on purpose.
However, I get triggered whenever someone uses the term "by design" wrongly. The generation of heat is not by design. It's an undesired side-effect of the computing being done. "By design" would mean that someone decided that there should be a certain amount of heat generation and made sure that it happens.
Most often I see this term misused by developers who explain bugs as being "by design". It happens when two features interact in an undesired way that creates problems (a bug). Developers like to look at feature A in isolation and determine that it works as designed, then look at feature B and determine that it also works as designed, then they look at and understand the interaction between A and B, and since they now understand what is happening, they claim it's "by design". However, nobody ever decided that features A and B should interact this way. It was clearly an oversight, and every normal person would agree that the interaction is undesired and a bug. But the developer says "won't fix, this is by design". Infuriating!
baruchel
2 hours ago
When you compute some nice and elegant result, dissipated heat is an undesired side effect. But let's face it: we are talking about proof of work. Proof of work means that a computer has run for some "required" time. In other words, you have to prove that enough heat has been dissipated. The waste of energy actually is "by design" here.
userbinator
12 hours ago
I wonder if there's enough heat being produced for it to act as a district heating plant.
hephaes7us
11 hours ago
There absolutely is, but of course there's a nonzero cost to capture it.
adonovan
8 hours ago
Also, the temperature is not high enough (compared to the steam coming out of a gas/oil/nuclear plant) to obtain much work from the waste heat.
tonyarkles
8 hours ago
That is 100% the issue. This is really low quality heat. Making it better would require even more energy input (e.g. a heat pump) because we can’t safely run electronics hot enough to generate high quality process heat.
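To put rough numbers on "low quality": the Carnot limit on converting heat to work is 1 - T_cold/T_hot (temperatures in kelvin). A sketch with assumed temperatures, ~45 °C data-center exhaust versus ~550 °C power-plant steam, both rejecting heat to ~25 °C ambient:

    def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
        """Maximum fraction of heat convertible to work between two temperatures (Celsius inputs)."""
        t_hot_k = t_hot_c + 273.15
        t_cold_k = t_cold_c + 273.15
        return 1 - t_cold_k / t_hot_k

    ambient_c = 25
    print(f"Data-center exhaust (45 C): {carnot_efficiency(45, ambient_c):.1%}")    # ~6%
    print(f"Power-plant steam (550 C):  {carnot_efficiency(550, ambient_c):.1%}")   # ~64%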
duskwuff
10 hours ago
Rockdale is a small town of ~5000 residents. Even if it were practical to install district heating - which I don't think it is - there certainly isn't demand for hundreds of megawatts of it.
kelnos
10 hours ago
I would definitely call that an inefficiency. Heat is wasted energy that in theory could be turned into useful work. The electricity used that created that heat (that is, not including the electricity that "went to" the computations themselves) ended up serving no useful purpose.
It would be wonderful if we could capture that waste heat and give it a useful purpose, like heating homes, or perhaps even generating new electricity.
(And this is before getting into the fact that I believe mining cryptocurrency is a wasteful use of electricity in the first place.)
tbrownaw
10 hours ago
> The electricity used that created that heat (that is, not including the electricity that "went to" the computations themselves) ended up serving no useful purpose.
Computational results do not contain stored potential energy. There is no such thing as energy being "used up" doing computation such that it doesn't end up as waste heat.
stouset
10 hours ago
> wasted energy that in theory could be turned into useful work
Even if turned into useful work, the end result of that work is still ultimately heat.
cousin_it
7 hours ago
Right, but if it's noticeably hotter than the environment, then that temperature difference could be used to drive a heat engine and get some more useful work. So the knee-jerk response "omg, we see the heat from space? it's gotta be wasteful!" is kind of correct, in theory.
anon84873628
6 hours ago
Some people are saying "waste heat" in the technical sense of "the heat my industrial process created and I need to get rid of" and others are saying "waste heat" as "heat humans are emitting into space without slapping at least one Carnot engine on it yet".
kiba
10 hours ago
All computation eventually becomes heat. There is no computer that doesn't generate heat.
We can generate less heat per computation, but heat ultimately cannot be avoided.
EGreg
9 hours ago
Seriously, the only way I would accept Bitcoin mining is as a winter heating source that pays for itself. Why can't people sell those?
yellowcake0
4 hours ago
The economics of bitcoin mining dictate that the work must have no other utility. If you increase its profitability by using it for auxiliary winter heating, then more people will mine bitcoin until there is an oversupply of heating and we return to the current equilibrium amount of "wasted" heat.
wmf
8 hours ago
Those exist but they're too expensive to pay back their cost. And heat pumps are 3x the efficiency of resistive heating.
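A tiny sketch of the arithmetic behind that (assuming a heat pump COP of ~3, i.e. three units of heat delivered per unit of electricity, versus exactly one for a mining rig used as a resistive heater):

    # Heat delivered per kWh of electricity: mining rig as resistive heater vs heat pump.
    electricity_kwh = 1.0
    resistive_cop = 1.0    # every kWh of electricity becomes exactly 1 kWh of heat
    heat_pump_cop = 3.0    # assumed typical COP; extra heat is pumped in from outdoors

    print(f"Mining rig as heater: {electricity_kwh * resistive_cop:.1f} kWh of heat")
    print(f"Heat pump:            {electricity_kwh * heat_pump_cop:.1f} kWh of heat")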
anon84873628
6 hours ago
Unfortunately, Bitcoin mining is pretty hot as far as regular computer use goes, but not very hot at all compared to burning some fuel.
akimbostrawman
3 hours ago
This obviously disingenuous framing is clearly intended to manipulate.
timeon
12 hours ago
Not sure what your point is. With PoW, inefficiency is by design.
edoceo
12 hours ago
You got the point. It's "by design" - you've both said it.
nh23423fefe
12 hours ago
A home can leak heat to the environment because of bad insulation. A datacenter doesn't "leak" heat, because "leaking" implies something normatively bad, and here the heat is dumped on purpose.