Hacker News
The surging demand for data is guzzling Virginia's water (grist.org)
36 points by rntn 13 days ago | 55 comments





From the linked study: [1]

Total operational water "footprint" of data centers in 2018: 5.13 * 10^8 m^3. Of that, indirect consumption through electricity generation: 3.83 * 10^8 m^3, or 74.7%. And indirect consumption related to water and wastewater utilities (non-electricity production): 4.50 * 10^5 m^3, or 0.09%.
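For anyone who wants to check those percentages, a quick back-of-the-envelope in Python (figures copied from section 3.1 of the study; the "direct" remainder is my inference, not a number the study states here):

```python
# Sanity check of the study's 2018 figures (section 3.1)
total = 5.13e8        # total operational water footprint, m^3
electricity = 3.83e8  # indirect consumption via electricity generation, m^3
utilities = 4.50e5    # indirect via water/wastewater utilities, m^3

print(f"electricity share: {electricity / total:.1%}")  # ~74.7%
print(f"utilities share:   {utilities / total:.2%}")    # ~0.09%
print(f"remainder (direct on-site use): {(total - electricity - utilities) / total:.1%}")
```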

The study then does a reasonable job of discussing the concerns of building data centers in water-stressed places.

This article seems to be another "didn't read the study, saw big number" thing.

[1] https://iopscience.iop.org/article/10.1088/1748-9326/abfba1 (PDF page 7, section 3.1)


Okay that makes more sense. I was skeptical of the article's claim that the water was just being heat-vaporized into oblivion.

That said, more water-sustainable power generation is clearly still a need.


If ISPs weren’t so stingy about providing symmetrical, >10 Gbps connections to residential homes, we wouldn’t need centralized data centers that consume massive amounts of resources in a single region. Instead we could have distributed these resources across the entire country and created local jobs (imagine a local “data center” that serves resources to the people of {city}, {state}).

The internet should be distributed; instead we have concentrated the vast majority of it into these “cloud” providers, which subsequently hoard the wealth (i.e., AWS subsidizing the losses of the warehouse division) and redistribute local resources to the top C-level executives and shareholders.

Jobs get outsourced (or “insourced”) to people who have no connection to the community and no care for quality, only for fake initiatives and “mission accomplished” type milestones.


I don't think this would solve the problem like you hope. Case in point, Netflix has this setup already for streaming video. Tons of small racks all over the world. It's a huge pain to maintain and was only worth it because of the huge difference in customer experience.

And even though they already have it, they still use AWS for all the control plane work, because it's just so much easier administratively. Administering distributed systems in just a few datacenters is hard enough. Doing it across thousands would be many orders of magnitude worse.

Heck, most companies can't even handle two datacenters.


I am not aware of what Netflix has been doing. Care to share any docs they have published or is this “insider” information?

It's not a secret, they explain it all here: https://openconnect.netflix.com/en/

How many companies need Netflix scale? A handful. So much of life online is not massive-scale, minimal-latency streaming video to the general public.

The cloud centralization in massive data centers is like CAFOs for computing.


I think you're missing the point. Even a company with that scale and resources has a hard time handling distributed systems. Even huge enterprises still use one datacenter because they don't have the talent and/or desire to do more than one.

So expecting everyone to solve those hard distributed systems problems is untenable. That's why everyone is centralizing in massive datacenters.


> Large data centers are resource hogs, using as much as 5 million gallons of water a day.

While this is a lot of water, as a point of comparison a typical golf course uses about 100-200 million gallons of water a year. It's still a drop in the bucket compared to agriculture.


You switched from gal/day to gal/year. Converting, golf courses use roughly a tenth of that amount of water in a day: ~500,000 gal.
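A quick sketch of that conversion, using the article's 5 million gal/day figure and the 100-200 million gal/year range quoted above:

```python
# Converting the golf-course figure from gal/year to gal/day
datacenter_gal_per_day = 5_000_000  # from the article
golf_low, golf_high = 100e6, 200e6  # gal/year, typical course

low_daily = golf_low / 365    # ~274,000 gal/day
high_daily = golf_high / 365  # ~548,000 gal/day

print(f"golf course: {low_daily:,.0f}-{high_daily:,.0f} gal/day")
print(f"data center uses ~{datacenter_gal_per_day / high_daily:.0f}x a course's high-end daily use")
```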

I'm fully aware. But I'm quoting the article, and golf courses don't use a consistent amount of water day to day, so I am trusting readers to do the math.

Presumably the demand for golf courses isn’t growing at the same rate as datacenters.

I believe that it has actually dropped since 2000, with double-digit percentages of courses closing.

Not defending the water usage of golf courses, but 5 million a day is a lot more water than 100-200 million per year. I guess you could say data centers provide a tangible utility while golf courses do not.

In fairness, golf courses help replenish aquifers in their area, and not all of them are sourced from “fossil water”. My city in the SE USA gets its water from a surface supply, so as long as the river through town is flowing, we’re good. The best golf course in the area backs up to said river; I don’t know for sure where they get their water, but since it’s essentially agricultural use, there’s no need to treat it prior to use. I would be surprised if they don’t just pump it directly from the river to the course.

It’s all going to end up in the same river after the fact anyway. Not all of us live in near-deserts.


If the data centers are being used to ship mostly garbage entertainment video, then I would say the utility of both is similarly "tangible" (whatever that means), but the datacenter obviously delivers orders of magnitude more utility hours (to orders of magnitude more people).

I think more importantly, a large regional T1 data center probably supports a region of the US that includes hundreds if not thousands of golf courses.

5 million gallons is approximately the amount the Mississippi river discharges in one second.
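That's roughly right. Assuming the commonly cited figure of about 600,000 cubic feet per second for the Mississippi's average discharge at the mouth (my assumption, not from the article):

```python
# Rough check: Mississippi discharge vs. the 5M gal/day data center figure
discharge_cfs = 600_000       # ft^3/s, assumed average discharge at the mouth
gal_per_cubic_foot = 7.48052  # US gallons per cubic foot

gal_per_second = discharge_cfs * gal_per_cubic_foot
print(f"~{gal_per_second / 1e6:.1f} million gallons per second")  # ~4.5
```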

I've heard of data centers on ships or on the water; someone once told me the internet wasn't green (a big shock to me at the time) and described what these projects are doing. https://news.microsoft.com/source/features/sustainability/pr... https://www.datacenterdynamics.com/en/news/nautilus-data-tec...

Not sure why they aren't more standard.


Is there some reason data centers can't/don't use a closed loop cooling system?

You get much more efficiency out of the system when you lose the water to evaporation. It takes all of the heat with it.

I know some data centers are experimenting with closed loops or wastewater systems, but while water is still priced so cheap in the US they are not in a huge rush.


It's expensive to build out and maintain. You also have to deal with leaks.

But actually a lot of newer spaces are moving to closed loop because it's the only way to get enough cooling for the GPUs. The newest NVIDIA GPUs require it.


We really need to figure out ways to get big fat data pipes to places that are naturally cold. Like imagine if we could get big pipes to Antarctica and then build data centers there. Plenty of fresh water and cold enough that you could almost just cool it with outside air, and you can exhaust the water right back outside to freeze back up. You could also use the exhaust heat to warm up the living spaces nearby.

The only question would be, would it create so much heat in one spot that it would actually warm up the environment too much?

The main thing stopping this now is lack of high throughput low latency data to the continent.


Minor detail: The myriad megajoules of heat which any major DC has got to breathe out...were previously inhaled as electrical energy. And multi-megawatt electrical service is quite difficult to source in most extremely cold places.

While not “extremely” cold, Quebec’s hydroelectric dams are mostly in the central/north of the province. I think Norway is similar to the point where electricity pricing is a lot cheaper in the north because of limited transmission capacity.

Both would be better choices than Virginia.


You could use solar panels for 1/2 the year and wind all year. Some stations are doing that today.

But yes, this is a major concern. A few modern nuclear plants would solve the problem, but you'd have to build them. It wouldn't be cheap.



Yes, but saltwater is very corrosive. Also they were hard to get to. A datacenter in the arctic regions would at least be accessible -- you could build attached housing and heat it with waste heat.

Lake Ontario has 4C water year round not far from shore.

Just need a big enough and deep enough lake in a temperate climate.

Not sure when all that added heat becomes a problem, but it’s gotta get dumped somewhere.


Building things on permafrost always ends badly. That is a huge problem in Alaska, where they have to constantly re-level buildings and build things on pylons.

> The only question would be, would it create so much heat in one spot that it would actually warm up the environment too much?

Probably not at a large scale, although similar situations have led to odd headlines like 'Florida's manatees are addicted to power plants'.

https://www.bbc.com/future/article/20240328-floridas-manatee...


Building on permafrost is actually tricky with just residential levels of heat - houses are sometimes built on stilts in order to avoid melting the ground under them and making it unstable.

But I like where your head is at. Maybe an offshore platform? Alternatively, there are lots of places that aren't permafrost-cold which are nonetheless quite cool underground year round. I wonder how big a geothermal system would have to be to be an effective heat sink for a data center.


> I wonder how big a geothermal system would have to be to be an effective heat sink for a data center.

Ah. 500 feet of pipe per 3.5 kW. That might be why they don't do it that way. XD
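Scaling that rule of thumb linearly (a big assumption, since ground loops saturate) to a hypothetical 10 MW facility shows why:

```python
# Ground-loop sizing from the "500 ft of pipe per 3.5 kW" rule of thumb
feet_per_kw = 500 / 3.5  # ~143 ft of pipe per kW of heat rejected
datacenter_kw = 10_000   # hypothetical 10 MW facility

pipe_feet = feet_per_kw * datacenter_kw
print(f"{pipe_feet:,.0f} ft of pipe (~{pipe_feet / 5280:.0f} miles)")
```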


Oh! Or when they do, it involves a 35 acre lake.

https://betterbuildingssolutioncenter.energy.gov/showcase-pr...


Or a 7000 acre reservoir for a nuclear plant: https://en.m.wikipedia.org/wiki/South_Texas_Nuclear_Generati...

>The only question would be, would it create so much heat in one spot that it would actually warm up the environment too much?

I doubt it; the bigger issue would be global warming.

What you said makes sense on paper, but the center would sink into the ice quickly. I remember reading that a base down there had to be moved because it was sinking.


Isn't Antarctica melting already? We need to slow down damn it.

Microsoft is already working on undersea data centers.

I assume some sort of geothermal cooling (including water-based) could be used in many regions already.


This is how the first chapter would start in a book where somebody builds a death ray in Antarctica.

The number and scale of data centers around Dulles is just crazy. The power consumption must be absurd.

I live nearby and whenever I drive/cycle through the area, I have to laugh a little. In person, it looks just as absurd as it sounds or looks on a map. $1 million homes surrounded by giant windowless hulks. It's possibly the most dystopian place in DC metro.

This article largely ignores the fact that there are common data center designs that use closed-loop cooling, and consume no more water on a daily basis than a barber shop.

It's not even hard to build them that way.


And here I am paying for energy to heat my house like a sucker. It would be neat if we co-located data centers in cities and then captured and transported the heat to homes and buildings.

Facebook's data centre in Denmark does this.

https://www.ramboll.com/extract-heat-from-data-centres/large...


Is that picture at the top real? It looks like aspirational pre-planning concept art!

I'm not certain, but it looks reasonable for Danish architecture.

I think it's here, but the StreetView photographs are old: https://maps.app.goo.gl/5M9pwoFgqceDXGtk7


Can't they use saltwater? Or grey water? What do coastline nuclear plants do?

Not coastline, but the Lake Anna facility in central VA uses lake water for cooling. It's a two-loop system - internal loop touches the radioactive bits, which is pumped to heat exchangers, and an external loop ingests cool lake water and dumps warmer water back into the lake.

The "warm" side of the lake has more valuable housing because the lake remains usable for a longer period of time each season. There's a series of dams that keep the smaller warm section separate from the natural cold side.

https://en.wikipedia.org/wiki/North_Anna_Nuclear_Generating_...

https://www.virginialakehouses.com/wordpress/wp-content/uplo...


I imagine the additional costs of cleaning/replacing tubing when contaminants accumulate from less-than-potable water outweigh the benefits (from the datacenter's perspective).

Why not used a closed loop system? Maybe something with geothermal cooling.

Presumably because it’s cheaper to use evaporative cooling

For data centers, let's use prime retail real estate at a freeway exit: https://www.google.com/maps/place/Loudoun+County,+VA/@38.999... nowhere near where others with a better understanding might put industrial property, like the power company: https://www.google.com/maps/place/Loudoun+County,+VA/@39.208... I wonder what consultant won that job.

It makes more sense if you need to defend the location in wartime. It also might not be just a datacenter even if there is one above ground. Many innocent looking features have strategic military value. Another easy one is the random long straight sections of US Interstates or Mt Weather Emergency Operations Center.

Scroll to the northwest a bit and you can see that you can ride Metro straight to the data center! How convenient!

Remember to bring lipstick as the trashed empty lots nearby serve as a great wild boar attraction to kiss and ride.


