> Don't these data centers have pretty elaborate cooling setups that use large volumes of water?
Depending on where, and (more importantly) when you last read about this, there have been some developments. The original book that started this had a unit conversion error, and the reported numbers were off by about 4500x: the author claimed a data center used 1000 times an entire city's water consumption, while in reality it was estimated at ~22% of that city's usage.
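That multiplier is easy to sanity-check from the two figures above; a trivial sketch in Python (the 1000x and 22% are the book's claim and the corrected estimate, as reported):

```python
# Sanity check on the ~4500x overstatement, using the two figures above:
# the book claimed ~1000x a city's water consumption; the corrected
# estimate was ~22% (i.e. 0.22x) of that same city's consumption.
claimed = 1000.0  # reported usage, as a multiple of city consumption
actual = 0.22     # corrected estimate, in the same units

print(f"overstated by ~{claimed / actual:.0f}x")  # -> overstated by ~4545x
```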
The problem is that we're living in the era of rage reporting, and corrections rarely get the same coverage as the initial shock claim.
On top of this, DCs don't make water "disappear", any more than farming makes it disappear. It re-enters the cycle via evaporation. (Also, on the topic of farming, don't look up how much water it takes to grow nuts or avocados. That's an unpopular topic, apparently.)
And thirdly, DCs use evaporative cooling because it's more efficient, not because it's the only option. If push came to shove, they could do without it, and when placed in areas without an adequate water supply, they do use regular (non-evaporative) cooling.
I always find the water-use-and-farming debate weird, living in a part of the planet where water for farms mostly, if not entirely, rains down from the sky. So whether it lands on farmland is inconsequential one way or the other.
Still, I do feel there must be some difference between farming and evaporative cooling, as with farming at least part of the water runs off back into rivers or seeps back into groundwater. That, again, depends largely on location.
I have no idea what book you're talking about, and I never claimed water "disappears" or made any argument about consumption statistics. Why would you assume I think water vanishes from existence? That's absurd.
My point is simple: the utility infrastructure is the hard part. The silicon sitting on raised floors is disposable and will be obsolete in a few years. But the power substations, fiber connections, and water infrastructure? That takes years to permit and build, and that's where the real value is.
Building that infrastructure (trenching water lines, building electrical substations, laying fiber) is the actual constraint and where the long-term value lies. Whether they're running GPUs or something else entirely, industries will pay for access to that utility infrastructure long after today's AI hardware is obsolete.
You're lecturing me about evaporative cooling efficiency while completely missing the point.
Sorry if it came across that way; that was not my intention. I thought you were asking, so I shared some info I'd recently read.
Water usage of the DC itself can vary a lot. If they're in an area where clean water is cheap, they might use evaporative cooling, which probably has the most significant water consumption (both by volume and because the water has been processed to be safe to drink). In other areas they may use non-potable water, or just a closed-loop water system where water usage is pretty negligible. At scale, electricity is going to be the much larger consideration (though that's still affected by local grid capacity). Also, the capital cost is a very significant part of these systems: there's a pretty big gap in pricing between 'worth building' and 'worth keeping running'.
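To make that last point concrete, here's a toy sketch of the sunk-cost logic (every number is made up purely for illustration):

```python
# Toy illustration of the gap between 'worth building' and 'worth keeping
# running' (all figures are hypothetical). Once the capex is sunk, the
# decision to keep operating only weighs revenue against operating cost.
capex_amortized = 120.0  # $M/yr: construction + hardware spread over lifetime
opex = 40.0              # $M/yr: power, water, staff, maintenance
revenue = 100.0          # $M/yr

worth_building = revenue > capex_amortized + opex  # False: wouldn't build today
worth_running = revenue > opex                     # True: but keep it running
print(worth_building, worth_running)               # -> False True
```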
(I recommend this video by Hank Green on the subject: https://www.youtube.com/watch?v=H_c6MWk7PQc . Water usage of data centers is a complex and quite localized concern, not something that's going to be a constant across every deployment)
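For a sense of scale on the evaporative side, a rough back-of-envelope (the 100 MW heat load is an assumption for illustration, and this ignores blowdown water that cooling towers also discharge):

```python
# Back-of-envelope: water evaporated to reject a data center's heat load.
# Assumes all heat leaves via evaporation, which overstates real towers a
# bit. Latent heat of vaporization is ~2.26 MJ/kg at 100 C (closer to
# ~2.45 MJ/kg at ambient temperature; fine for an order of magnitude).
LATENT_HEAT_J_PER_KG = 2.26e6
heat_load_w = 100e6  # hypothetical 100 MW facility

kg_per_s = heat_load_w / LATENT_HEAT_J_PER_KG
m3_per_day = kg_per_s * 86_400 / 1_000  # 1000 kg of water ~= 1 m^3

print(f"~{m3_per_day:,.0f} m^3/day")  # -> ~3,823 m^3/day
```

A few thousand cubic meters a day is trivial next to a large river and a real problem in an arid basin, which is exactly why this is such a localized concern.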
Semiconductor manufacturing might seem to make sense here, but I don't think it does, simply because it would probably require a lot of expertise, knowledge, complex machinery, and industry experience, which I assume would be very hard to gather even for these datacenters.
I don't see any reasonable path forward for these datacenters that justifies the amount of money that has been invested in them.
Semiconductor manufacturing needs a supply chain a lot more than it needs fast internet: wafers, fine chemicals, gases, consumable parts. A lot of this comes from petroleum refining, so it helps to be near refineries, although that's not enough to be decisive in site selection.
Agreed. Your point is valid, and for that reason too I don't think they could really be used for the semiconductor industry.
No other industry seems to me to have much overlap with the datacenter industry either, aside from needing water access, land, and electricity, and I doubt those facilities would get used enough to justify their costs, especially the cost of the overpriced GPUs, RAM, and other components.
In my opinion, these large datacenters are mostly a lost cause if the AI bubble bursts, since they were built with such a strong focus on GPUs and their whole demand model is tied to AI.
If the bubble bursts, I think the server hardware might get auctioned off, but I doubt much of it would be non-GPU, pure-compute servers, or GPUs that are actually useful to the average consumer.