This is the first time I've heard that corrosion is a problem in datacentres. I've never noticed any corrosion on any machine at home or work. Does it have to do with the cooling of datacentres?
For example, I don't think I've ever had a hard drive fail in ~30 years in computing (well, without being dropped, that is), yet look at the Backblaze reports:
Admittedly my own experience is probably a bit skewed since I don't have a personal laptop. By the time I got a job that provided me w/ a laptop, solid-state drives were already quite mature. In my experience doing repair work, laptops were a huge source of our failed-drive and data-recovery operations.
: I should note these drives all operate nearly 24/7, no five-nines on my personal equipment though ;-P
They even had the audacity to reject my warranty claim.
In the last couple years I've had several HDDs fail starting with a 2TB Seagate (one of the "stars" in the Backblaze stats.) I've had others fail since, mostly multi-TB drives. I had one nearly ten year old 200GB drive begin to report remapped sectors. I've even had an SSD develop an unrecoverable error.
I didn't believe him either. I was like "Dad, SSDs are designed to fail in a way you can recover data. What SATA cable are you using? Is the port dead?"
Turns out it was literally the drive.
Intel had a strategy that caused them to go into read only mode once they reached their rated lifetime write capacity. All data remained available until they were power cycled whereupon they would intentionally brick themselves.
In my case it was a Crucial M4. It still tries to operate but will fail the long SMART self test.
I've been very wary of SSDs since then.
I tried putting one of those in the freezer once, and maybe it was a folk remedy, but it worked enough to boot and get my data off!
What freezing the drive may achieve is either un-sticking it from the bearings (see also the "bang drive on edge" technique), or lowering the thermal noise floor in the electronics enough for marginal components to make it through the boot sequence.
It's not recommended. https://www.gillware.com/blog/data-recovery/hard-drive-freez...
In fact, no matter how long I left it in the freezer, I couldn't get all the data off. I ended up freezing it in the freezer, then putting it in a small cooler with a couple ice packs, with the wires hanging out between the cooler and lid. That gave me just enough to grab all the data (well, really, a couple of not-so-small VHDX files).
It was much fun, like a science experiment in high school.
That said, that paints a pretty handy real world picture of what one might expect.
I (on a Mac) was unaffected (and no doubt a little smug).
Data centers used to be like walk-in refrigerators. Now the air is borderline uncomfortably warm and slightly "heavy" with humidity.
Failure rate is a bit higher, but they are confident that their architecture is HA/redundant, and from what they told me, it's cheaper to replace hardware (they got some Dell "fresh-air" servers) than to cool the DC.
Seems like it's a combination of cooling data centres (flowing air) and the quality of the air being circulated. So if there are pollutants in the air, they pose a danger to the circuitry.
Never occurred to me this was yet another thing to be wary of when running a data centre!
It's probably the "sea" environment that would pose corrosion issues.
If you look at some of the older electronic test equipment which was made with much higher standard materials it's not uncommon to find something that has been in a damp shed for three decades and powers up just fine after the dead spiders have been removed and the mould cleaned off. BUT at the time of manufacture they cost more than a mid-range car.
I'm now close to the Pacific. No issues whatsoever.
But while the older equipment was pretty stable (so long as the capacitors didn't bulge), today's equipment usually has to comply with RoHS. Lead, for all its faults, has a very well-understood corrosion mechanism. The exotic blends used to replace it, we're discovering, aren't always so great...
You know, now that I think about it, I wonder how many cell phone warranty claims denied for water ingress were actually due to poor solder choices? Color-changing stickers aside, of course.
tldr: gaseous contaminants corrode silver/copper components of circuitry
Meh. The biggest data center (ChinaTel's Inner Mongolia Information Park) consumes about 150MWe, assuming all of it becomes waste heat it's basically nothing: a nuclear reactor (each plant has 2~8) releases 2Wt for each We it produces, and they're usually sized between 800 and 1300MWe.
Hell, Earth averages 160W/m^2 from the sun, oceans cover 360 million km^2, so oceans get on the order of 10^16 watts from the sun.
Not to say that it can't have a significant local effect: nuclear plants are strongly regulated to avoid heating their rivers too much (especially in warm summers), and there again we're dealing with orders of magnitude more heat, dumped as waste into a very finite (though moving) amount of water.
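The back-of-envelope comparison above can be sketched in a few lines of Python; all figures are the ones assumed in the comment (150 MW for a large datacenter, 160 W/m^2 average insolation, 360 million km^2 of ocean):

```python
# Compare one large datacenter's waste heat with solar input to the oceans.
DATACENTER_W = 150e6          # ~150 MW, assumed to end up entirely as heat
SOLAR_FLUX_W_M2 = 160         # average solar power absorbed per m^2
OCEAN_AREA_M2 = 360e6 * 1e6   # 360 million km^2, converted to m^2

solar_input_w = SOLAR_FLUX_W_M2 * OCEAN_AREA_M2
ratio = solar_input_w / DATACENTER_W
print(f"solar input to oceans: {solar_input_w:.1e} W")  # ~5.8e16 W
print(f"ratio to one datacenter: {ratio:.1e}")          # ~4e8
```

So the sun delivers several hundred million times more heat to the oceans than the largest datacenter does, which is the commenter's point about global (not local) effects.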
"Experimental underwater data centres could be more sustainable if connected to offshore wind power, but Microsoft must focus more on investing in new renewable energy now. [...]"
Seems a shame to let all that heat go to waste. Surely we could put that waste heat to use in a distillery or desalinization plant or something.
So combination data center, desalinization plant, and pickle factory then?
This gif: https://www.ucsusa.org/sites/default/files/images/2015/08/np... shows how typical nuclear reactors work. They're functionally just steam turbines, the nuclear power is just capable of heating a whole lot of water.
The waste heat, though, doesn't come in contact with the radioactive parts. That's what happens at the bottom, in the condenser.
My prediction is we'll be amazed at the life forms that develop and explode around such submerged cooling structures.
Ocean warming is extremely detrimental to ecosystems across the globe. You can't just simplify it to more heat == more energy == better
>In other non-tropical regions of the world, marine heat waves have been responsible for widespread loss of habitat-forming species, their distribution and the structure of their communities. This has a tremendous economic impact on seafood industries that have to deal with the decline of essential fishery species.
>It is likely that over the past century, the impacts on ecological chains have been more frequent as ocean temperatures have escalated decade after decade. This is the case in Nova Scotia, where kelp forests are literally being wiped-out by water which is much warmer than usual. In this corner of Canada, the ocean is not just a form of recreation, it also means a way of life for many that rely on fisheries and aquaculture as an important part of their economy.
That's cute, but 1. nature will adapt just fine to both heat and cold so that's not exactly compelling; 2. the issue is we may not, human civilisations have arisen in fairly specific conditions, and tend not to be very resilient to significant environmental changes
> My prediction is we'll be amazed at the life forms that develop and explode around such submerged cooling structures.
My prediction is we won't live long enough to see that happen, it takes kilo- to mega-years for anything more complex than bacterial mats to evolve to use new sources of anything.
Not with the same ease! That's the point — it is a one-way street. "Having easy access to energy" or "not having easy access to energy" are NOT equivalent states for flourishment. They're not equally "just fine".
The rest seems like you're grinding some anthropocentric axe unrelated to my post, so I'll abstain.
Life needs an energy gradient. In this case, direct access to colder water. No organism (or machine) can use the heat energy of its environment if it has no access to a colder medium.
Edit: I just saw that Retric explained it better (https://news.ycombinator.com/item?id=17245948).
It looks like my original comment hit some HN ideological hot spot (unintentionally), but it's entirely uncontroversial scientifically.
If a fish is swimming in 24°C water, it can't simultaneously be swimming in 19°C water. Maybe its friend 10 meters away is swimming in 19°C water, but that doesn't help the first fish.
Maybe that fish feels more comfortable in 24°C water, because it needs a certain body temperature to keep its internal processes running (i.e., to not freeze to death), but it cannot harvest energy from the 24°C water, which is what you claimed above. I'm not nitpicking, this is one of the most fundamental and important laws of physics.
This is probably more relevant at temperature gradients greater than 5C, but it's thermodynamically possible.
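The fish discussion above is really about Carnot efficiency: even a perfect engine working between 24°C and 19°C water could extract only a tiny fraction of the heat as work. A quick sketch (temperatures taken from the comment):

```python
# Carnot limit for a heat engine between 24 C and 19 C water.
T_hot = 273.15 + 24.0   # warm water, in kelvin
T_cold = 273.15 + 19.0  # cold water, in kelvin

eta = 1 - T_cold / T_hot  # maximum possible fraction of heat usable as work
print(f"Carnot efficiency: {eta:.1%}")  # ~1.7%
```

At a 5°C gradient the theoretical ceiling is under 2%, before any real-world losses, which is why ambient heat is so hard for organisms (or machines) to harvest.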
In this context, we're not worried about the literal definition of natural but about keeping an environment in a state in which humanity can survive.
Life wants low-entropy energy like sunlight or glucose because it can do something with it. Heat can speed up chemical reactions, which makes a minimal amount useful indirectly, but not as an energy source.
PS: Heat gradients are usable energy and, for example, create thermals which are then useful.
I was actually pondering whether to expand on "energy", "water", "carbon" and "life" in the original post (none of them trivial concepts) but decided against it. It'd only muddy the waters, so to speak, missing the point:
A submerged data centre will be a net boon to the biological life around it.
Why would that be the case? If this were the case, the biological density of the highest-temperature locations in the ocean should be significantly higher than average. This has not been observed. Quite the opposite, actually.
Secondly, in thermodynamic terms, heat is the least useful form of energy. Heat is often the waste product of a chemical/mechanical process and cannot easily be converted into other forms of energy without significant loss.
And no offense, but water and carbon are both trivial, well-defined concepts. Energy is also fairly well defined.
I agree making use of subtle energy gradients is not trivial, but life is pretty good at it nevertheless. Even in conditions you wouldn't expect it. And no surprise — it had billions of years to evolve that way.
If you wanted to be daring, you could even say that's what life is for.
That's absolutely not what the comment you're replying to says. What it says is:
> If this were the case, the biological density of the highest-temperature locations in the ocean should be significantly higher than average.
> Thermals mentioned by OP are just one obvious example.
Thermals are not just heat, and by and large the heat is not a source of energy (sulphur chemistry is the basal energy source of thermals). And shallow waters have much higher biological densities.
Ambient heat is only useful so far as helping the organism improve the efficiency of its chemical and biological reactions, it's extremely rare for it to be an actual energy source (because as you've been told multiple times it's extremely hard to use/harvest). And organisms are generally adapted to a certain level of ambient heat with compensatory mechanisms matching, most don't do very well if you drastically change their ambient heat levels, again aside from micro-organisms with short lifecycle which can adapt extremely quickly.
You disagree, giving reasons I find irrelevant here (a data centre won't make a dent in the average ocean temperature, and certainly won't make it "the highest-temperature location in the ocean"), but I respect that. The good news is that the impact will be easy to evaluate once deployed.
In fact, testing the data centre's impact on the surrounding ecosystem will surely be a mandatory component of any such project, so we'll get to see the hard data. Let evidence be the judge of the "absolutely nots".
Is it to steadfastly argue positions far outside your domain expertise or engage in discourse and learn?
Of course you could argue that the lower cost of this cooling method will create a larger demand for cheaper servers/data storage, which increases net harm, for which I have no answer.
That argument doesn't apply here: demand isn't currently being constrained by us being afraid of the environmental impact of datacenters.
Conversely, if it became literally free to operate, you can imagine we'd have a higher demand for data, storage, etc.
It's like how center pivot irrigation reduced the water usage rate per acre but increased demand for water usage in various areas because costs were lowered as well.
That's not the case if you're comparing with AC. You have to use energy (generating more waste heat) to pump heat up a temperature gradient.
Find a desert under the sea and drop them all there. Some place that's basically water and dirt. It's probably going to be large enough that you could dump all of humanity's data centers there for the next 100 years and still have room to spare.
It's rather easy to calculate because one calorie is the energy required to heat up one gramme of water by one degree Celsius, the rest is just unit conversions, but I could have messed up anywhere of course.
Even if I'm off by a few orders of magnitude I'm confident that datacenters won't be noticeably heating up the sea for a long time.
I'm not saying there is a problem with your calculations, but if you were just one order of magnitude off, then after 10 years that would be 1°C, which is a colossal amount. The main problem with global warming, as I understand it, is that a 1 or 2 degree change to the atmosphere would melt a huge amount of polar ice. I imagine that if the sea were to increase in temperature by that amount, given that water has better thermal conductivity than air and that's kinda where the polar ice mostly is, the effects would be at least as bad (?)
Edit: given that some of the heat would dissipate into the atmosphere and sea bed, maybe it would need to be more than one order of magnitude higher to have this effect.
That is assuming that all the electricity produced on Earth is used to heat the oceans, which is not realistic. Oceanic datacenters are not likely to ever amount to more than a few percent (and even that is unlikely) of the total human electric consumption.
For reference, apparently datacenters used 416 TWh in 2015, which is roughly 2% of the total electricity usage.
That's why I'm confident that there is a lot of margin in my calculation.
Oops, I misread your comment, sorry. I thought you were talking about all electricity used in data centres worldwide.
or 1000 years if off in the other direction
$ units -t '24816400 GWh / (1347000000 km^3) / (1 kg/dm^3) / (4200 J/kg/K)'
$ units -t '24816400 GWh / (1347000000 km^3) / waterdensity / water_specificheat'
Since that's the same order of magnitude as, say, a US household, that does seem credible.
You're not putting all of the planet's datacenters within 5km^3 either. The calculation is fine.
Your calculation is stupid because you don't need to heat up the whole ocean; heating up small areas will have a butterfly effect, enough to badly affect bigger areas. Look at this:
>As the concentration of carbon dioxide in the atmosphere rises due to emissions of fossil fuels, more of the gas is dissolving in the ocean. That is lowering the pH of the water, or making it more acidic, which can make it harder for reef-building organisms to construct their hard skeletons.
A minor change in CO2 changed the pH of the water, which kills organisms over a wider area.
There are also closed seas like the Baltic that need over 100 years to fully mix their water with ocean water; it's much less salty than other seas, warmer, and also much more polluted from toxins that were sunk there during and after WW2.
No need to be rude. "There is an issue with your calculation because" would have worked :/
Then there are natural geothermal vents doing that, but probably at an order of magnitude more, since before humans were around.
The ocean is a pretty big thermal mass.
Take the mediterranean for example, a relatively small ocean. It has a volume of 3.75e15m3, with a mass of 3.75e18kg.
We need 4kJ/kg/K to heat up water. To heat the whole mediterranean by 1K, we need an energy of 1.5e19kJ, or 4.16e6 TWh.
In 2008, total worldwide primary energy consumption was 1.32e5 TWh, 31x less energy than needed to heat the mediterranean by 1K.
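Checking the Mediterranean numbers above in Python (same figures as the comment: 3.75e15 m^3 of water, ~4 kJ/kg/K, 1.32e5 TWh of world primary energy in 2008):

```python
# Energy needed to heat the whole Mediterranean by 1 K, in TWh.
VOLUME_M3 = 3.75e15
MASS_KG = VOLUME_M3 * 1000          # water density ~1000 kg/m^3
C_WATER = 4000                      # J/(kg*K), i.e. 4 kJ/kg/K as used above

energy_j = MASS_KG * C_WATER        # joules per 1 K of warming
energy_twh = energy_j / 3.6e15      # 1 TWh = 3.6e15 J
world_2008_twh = 1.32e5             # total worldwide primary energy, 2008

print(f"energy to heat Med by 1 K: {energy_twh:.2e} TWh")  # ~4.2e6 TWh
print(f"vs world consumption: {energy_twh / world_2008_twh:.0f}x")  # ~32x
```

The numbers check out: even this "relatively small" sea would absorb roughly 30 years of humanity's entire energy output per degree of warming.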
I doubt it will become popular because it involves waterproofing the enclosures, complicates maintenance staff access, and also carries the risk of water getting in and damaging the equipment.
There are other alternatives to AC. Using a cooling tower is a much better alternative to submerging the whole datacenter.
>> "Microsoft's Ben Cutler insists the warming effect will be minimal - "the water just metres downstream would get a few thousandths of a degree warmer at most" - and the overall environmental impact of the Orkney data centre will be positive."
It's not like we've heard that before about carbon dioxide emissions and other environmental pollution. Companies mostly have interests other than protecting our environment, going by past experience.
I'm very interested to see what scientists and researchers think of this.
This is a lot more efficient.
Perhaps there might be a localised effect, but I doubt the effects would be as severe as the carbon emissions saved.
Makes as much sense as the people who think that windfarms will eventually stop the wind, or people who worry that smoking is contributing to global warming.
But since the goal is to cool the datacenter, we want the temperature difference to be as small as possible.
Reclaiming energy from waste heat usually only pays off in industrial settings. If you have very hot steam, you can use it to power a steam turbine and generate electricity. But at lower temperatures, the efficiency is too low.
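The temperature dependence described above is again just the Carnot limit. A small sketch, with hypothetical temperatures (400°C industrial steam vs. ~45°C server cooling water, both rejecting heat to 20°C ambient):

```python
# Carnot ceiling for converting waste heat back into work,
# at industrial-steam vs. warm-water temperatures.
def carnot_efficiency(t_hot_c, t_cold_c=20.0):
    """Maximum fraction of heat convertible to work (temperatures in Celsius)."""
    return 1 - (273.15 + t_cold_c) / (273.15 + t_hot_c)

print(f"steam at 400 C: {carnot_efficiency(400):.0%}")  # ~56%
print(f"water at 45 C:  {carnot_efficiency(45):.0%}")   # ~8%
```

Even before real-world losses, low-grade datacenter heat caps out at a few percent recoverable work, which is why reclamation only pays off in hot industrial settings.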
> "You just end up with a warmer sea and bigger fish," he says.
> And 90% of Europe's data centres are in big cities such as London and Madrid because that is where they are needed.
> Microsoft's Ben Cutler insists the warming effect will be minimal - "the water just metres downstream would get a few thousandths of a degree warmer at most" - and the overall environmental impact of the Orkney data centre will be positive.
Both of these opinions feel like exaggerations but I don't know enough about thermodynamics to know what the true answer is. I have a feeling that the Microsoft opinion is much closer to the truth, can anyone help me understand how I'd walk through the numbers?
What I'm more unclear on is how small the impact is. It seems very likely to me that throwing the waste heat into the bottom of the ocean has less of an environmental impact than running A/C to suck out the heat and dump it into the air.
However, is the impact really so small that you wouldn't notice it just meters away? That's the part I'm unclear on.
But the efficiency gains are such: Air conditioning is actually inefficient, especially with air cooling. Water cooling is more efficient. Water cooling traditionally uses water as an exchange medium, but is eventually water-cooled-by-air anyway, just in bigger batches. Here, they can take in new cold water and throw out old hot water without actually bothering with any air exchange at all. Or at least, that's the plan. Maybe it'll work!
(If "nobody cares because energy is cheap", we wouldn't have tried this in the first place.)
That's a false dichotomy. More energy-efficient CPUs don't preclude efficiencies from cooling.
I suppose the case could be made that CPUs account for the vast majority of datacenter cooling needs and that ARM efficiencies would eliminate so much of that need that any efficiencies in the cooling itself would not be worth anywhere near this kind of cost. I'd expect some pretty extraordinary evidence backing up that argument, since those would be pretty extraordinary claims.
There is precedent. In Seattle there's the old "steam plant" which piped steam to many local buildings to heat them in winter.
About ten years ago, before working at Amazon, my last job involved building a datacenter in Leiden, the Netherlands. There the city has a municipal heat exchange program and we could also vent the excess heat to be used for heating water.
Modern data centers, especially for Cloud services, are really really really big though ... so big that they have specialized real estate and power requirements. The locations where you can get that much power, and that much space, tend to be outer sub-urban or quite remote. In those locations, there are few consumers for excess heat so more effort goes into reclaiming the energy loss through other means.
Denver's got the oldest continually operating system in the world, and within the last decade or so, they added a cooling loop, as well. Instead of a boiler and a cooling tower, you can subscribe to a steam loop and a chiller loop.
The problem is once again of gradient. It makes thermodynamic sense to pipe steam around, because of the large gradient between steam and ambient. But servers don't make steam. At best they make warm water only a couple of dozen degrees over ambient... and warm water doesn't have enough energy to heat buildings very effectively.
That said, and as someone already mentioned, people are doing it anyway. Seattle's internet exchange pipes its water across the street to the Amazon towers.
BTW, Seattle's Georgetown steam plant might not be making steam any more, but the one down by the market is still operating as Emwave Seattle.
Consider that geothermal heating systems are based on the ground having a temperature of 55 degrees. The difference between that and cooler ambient is used to drastically reduce heating bills.
For example, if it is 30 degrees outside, that is heated to 55 by the earth, then the building heater only has to boost the 55 to 70 rather than 30 to 70.
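The preheating arithmetic above amounts to comparing temperature lifts; a minimal sketch, assuming heating energy is proportional to the lift (it ignores heat-pump COP and other real-world factors):

```python
# Ground-source preheating: how much of the heating load remains
# when the earth preheats incoming air from 30 F to 55 F?
outside_f, ground_f, target_f = 30, 55, 70

lift_without_geo = target_f - outside_f  # 40 degrees of lift
lift_with_geo = target_f - ground_f      # 15 degrees of lift

fraction = lift_with_geo / lift_without_geo
print(f"remaining heating load: {fraction:.0%}")  # ~38%
```

Under that simplification, the building heater does less than 40% of the work it would otherwise, which is the "drastically reduce heating bills" claim.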
I'm assuming it has to do with distribution and reliability within regions, hence why you even need datacenters at different locations and not just one location.
I can't imagine submersion is significantly better for cooling; the energy density very likely already requires a circulating water system, and the added issues of dealing with pressure at depth seem risky.
Possibly there is a benefit not related to cooling. Perhaps weather/waves? Things are probably a lot calmer 30ft below the surface. The tossing and turning on the surface may place additional stresses on things like HDD spindles.
Each time you do that, it puts a slight load on the spindle bearings. Do it repeatedly on a varying 5-30 second cycle and you'll simulate what a harddrive on a boat or barge in the open water would experience.
I can imagine that would create additional wear and tear and contribute to an increased failure rate.
And here's the project site: http://natick.research.microsoft.com/
Thus there are kind of two cases to solve for. You can stick a data center in Northern Europe, and that heat might be valuable enough that your approach could be to try to reclaim it. If you stick a data center in Singapore, you'd better focus on generating less heat in the first place or finding better ways to get rid of it.
> Optimal intermediate trading node locations for all pairs of 52 major securities exchanges, calculated using Eq. 9 as midpoints weighted by turnover velocity... While some nodes are in regions with dense fiber-optic networks, many others are in the ocean or other sparsely connected regions.
For example, if you had an underwater data center that sat in the middle of the Atlantic Ocean between New York and London, you could do some serious trading with that capability.
You can note, for example, that there is no data centre half way between Chicago and New York, even though that area has cheap accessible real estate and billions of dollars have been spent on low latency communication links between the two.
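To put rough numbers on the latency argument, here's a sketch of one-way fiber latency. The figures are assumptions for illustration: signal speed in fiber of ~2/3 c, and an approximate NY-London great-circle distance of ~5,570 km:

```python
# One-way latency through optical fiber between two points,
# assuming light travels at roughly 2/3 c in glass.
C_FIBER_KM_PER_S = 200_000  # ~2e5 km/s

def one_way_ms(distance_km):
    """One-way propagation delay in milliseconds."""
    return distance_km / C_FIBER_KM_PER_S * 1000

ny_london_km = 5570  # approximate great-circle distance
print(f"NY-London direct:   {one_way_ms(ny_london_km):.1f} ms")  # ~27.9 ms
print(f"to a midpoint node: {one_way_ms(ny_london_km / 2):.1f} ms each way")
```

A mid-ocean node halves the distance to each exchange, which is exactly why the cited paper's "optimal intermediate trading nodes" land in the ocean.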
Why spend several million dollars in leasing space when you can drop a datacenter capsule off the coast with free cooling? Who cares about the cost of hardware.
You’re going to pay even if it doesn’t have a street address.
Of course, Orkney was part of Norway before becoming part of Scotland in 1468.
My cousin, from Indiana, is currently serving as a minister on Stronsay. The scenic photos he's sharing are certainly encouraging me to contemplate a visit.
Edit: (Can't work out if was great great grandfather or not! - but certainly some ancestor!)
Edit2: I'd definitely recommend visiting Orkney - it regularly gets evaluated as one of the nicest bits of the UK to live, the scenery is fantastic and it has some of the most amazing historical artifacts in Europe.
After that, I'm not so sure. Perhaps a slow, continuous movement of an exchange surface past a hard surface would do the trick. When this stuff is young, it's easily wiped away, but once it's there for a while you get real problems.
But what happens after 5 years (the expected life of the datacenter)? Most of the computers in there will be worthless then. Will they bring up the datacenter and reload it with new equipment? Will the cost-benefit be in favor of re-equipping, or of just sinking a new one? If it's going to be cheaper to just drop a new one in, we will have the ocean floor littered with dead datacenters.
Haven't heard anything about this.
An even more advanced concept: combined district heating and cooling, with data centers integrated into it. Data center heat can be used in cold regions for heating. Combined heat pump/chiller units can produce both heat and cooling at the same time. Sea water (through absorption chillers) is used for cooling apartments, offices and data centers. Heat from data centers, purified waste water, etc. is used for heating apartments. During winter more heat is utilized; during summer more water cooling is utilized.
Suvilahti Data Centre New Build Case Study: Operationalizing District Heating and Cooling in large Data Centre in Former Electricity Station http://www.energy-cities.eu/IMG/pdf/WS2_Helsinki.pdf
COMBINED DISTRICT HEATING AND COOLING
District Heating & Cooling in Helsinki
I'm quite confident that even making optimal use of the excess heat you would end up with less money than with their solution. I don't think companies want maximum energy efficiency. They want maximum cost efficiency.
> physical security etc.
Physical security is not a problem. It's not like the city utilities manage the cooling of the data center. There are heat exchangers between. If there is problem with city utilities, the heated sea water goes back to the sea.
There's also plenty of coastline away from major cities. Find enough open space and it becomes feasibly multi-tenant. Add wind/solar and run a cable (or 3).
https://www.wired.com/2012/01/google-finland/ there should be a video with the details here.
I'll give you the constant temperature but this project is about using the nearly-free ocean to maintain temperature so surely the delta isn't too large here.
I don't see how this is easier than a boat at anchor.
In reality of course connectivity would be a major problem and energy price differences are probably not large enough to make it viable.
Power it with offshore windmills, and you really only need a network connection.
Great idea though, using water to cool your datacenter and leaving all the oxygen out.
The MSFT Datacenter is a vessel under law, if they don’t want to fly it under a flag they’ll have to essentially abandon it and then salvage laws come into play.
In a more fun world, one of those vessels would fly the Jolly Roger and host a certain infamous Swedish bay.
As for “without laws” google still follows European laws even if my search happens to be handled on an American or British machine. You obey the laws where you operate not where your server is located.
But if the economics make sense on some level in the future, I could see criminal enterprises interested in exploring things like this. If narcos are building/using submarines now, I could see them managing/tracking/monitoring global logistics from their sunken data centers (they'll need to find other ways to power them and to transmit/receive, because I doubt they could just hook their cables up onshore anywhere, setting aside that doing so would be an easy choke point for cutting access).
Maybe future narcos, illegal EU personal-data miners/brokers, pirates, or whoever else could be interested, especially if the fixed cost of operating conventionally exceeds that of future sunken data centers, and the probability/cost of seizure does too?
For example: industrial, unregulated fishing destroys coral reefs every day. They are thousands of years old and won't come back.
This is the part that concerns me the most. Pretty much all DCs have multiple 24/7 staff to deal with hardware failures and equipment swaps... telling a client "you can't access your hardware for 5 years" wouldn't go over too well.
In that model, hardware failing is just a factor in total overhead cost. If the hardware doesn’t fail immediately it might be cheaper to leave a dead node in the rack than to pay a human to touch it, especially if they’ve already recouped a significant percentage of the purchase cost by the time it fails. Over the life of a server the cost of cooling is enough that a substantial savings will push that breakeven point earlier.
How much is that in Library of Congress units? This trend of not giving out data but appearing to do so is strange.
Assuming you're using 15TB as one LoC unit, then about 1,800 LoCs.
In theory you'd only leave it down there for three years anyway before everything in it is worth zero, at least to the IRS.
That kind of thing eats directly into the ROI for a datacenter. I doubt it competes with a static building with a bunch of solar panels on top.
> There has been growing concern that the rapid expansion of the data centre industry could mean an explosion in energy use. But Emma Fryer, who represents the sector at Tech UK, says that fear has been exaggerated.
> "What's happened is we've had the benefit of Moore's Law. The continued advances in processing power have made doom-laden predictions look foolish"
There may be other reasons that energy efficiency will continue to improve, but Moore's law (more specifically, in relation to performance per watt, Dennard scaling) has long been at an end. Given her position, it's fairly ignorant to cite this as a reason to expect continued slow growth in energy consumption by now.
But I guess we won’t know until we try!
Yet I assume this isn't particularly useful for a datacentre of a well-understood shape that doesn't intend to use the conditions to create power through novel designs.
There's a proposed upgrade to link to the grid at the nearby Dounreay nuclear power station: https://www.ssepd.co.uk/OrkneyCaithness/