Hacker News

I keep hearing people claim that water is just as much an issue as energy for operating these DCs, but that just doesn't make sense to me. Then again, I haven't had to step inside a DC in almost two decades.


Continuously dissipating 1 gigawatt of energy by boiling room-temperature water would require approximately 1.38 million liters of water per hour.
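A quick sanity check of that figure, assuming textbook values for water's specific heat and latent heat of vaporization (starting temperature of 25 °C is an assumption):

```python
# Back-of-envelope check: water throughput needed to dump 1 GW by boiling.
power_w = 1e9                       # 1 GW of heat to dissipate
cp = 4.186e3                        # J/(kg*K), specific heat of liquid water
sensible = cp * 75                  # J/kg to heat water from 25 C to 100 C
latent = 2.257e6                    # J/kg to vaporize water at 100 C
energy_per_kg = sensible + latent   # ~2.57 MJ removed per kg boiled off

kg_per_hour = power_w / energy_per_kg * 3600
print(f"{kg_per_hour:,.0f} kg/h")   # ~1.4 million kg (= liters) per hour
```

Depending on the assumed inlet temperature this lands within a few percent of the 1.38 million liters/hour quoted above.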

Seems like the environmentally responsible thing to do would be to build the datacenter near the coast and use the waste heat to desalinate water. Or at least dissipate the heat into the ocean rather than boiling off an inland freshwater supply.


And kill the local aquatic life as you raise the temp beyond their happy place?


Setting aside a small patch of ocean for the task seems like a much better plan than the current practice. Provided you dump it in a place with a decent current, any adversely affected area should be exceedingly small.

Keep in mind that the sun is constantly dumping energy on us. Absorption averaged across the entire Earth is ~200 W/m^2. Assuming I didn't misplace a zero somewhere, a gigawatt corresponds to ~5 km^2 of ocean surface. And that's just the daily-averaged flux. Light penetration falls off exponentially, so ~75% of it is absorbed within the top ~10 m.
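The area estimate above is straightforward to verify (using the ~200 W/m^2 averaged absorption figure quoted in the comment):

```python
# Sea-surface area that absorbs 1 GW of sunlight at the globally
# averaged absorption rate of ~200 W/m^2.
power_w = 1e9          # 1 GW
flux_w_per_m2 = 200.0  # averaged solar absorption per square meter
area_m2 = power_w / flux_w_per_m2
area_km2 = area_m2 / 1e6
print(area_km2)        # 5.0 km^2
```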

I think the takeaway here is the utterly incomprehensible scale of the ocean.


Run it through a turbine and generate electricity to power the datacenter: infinite energy and infinite AI unlocked.


This idea is probably more worthwhile in Middle Eastern countries, given that 90% of their water comes from desalination plants. But given the recent war in the region, I don't really expect datacenters to be built there for quite a long time.



