Hacker News
Microsoft has sunk a data centre in the sea to investigate energy efficiency (bbc.com)
512 points by Qworg 5 months ago | 376 comments



> Additionally because there are no people, we can take all the oxygen and most of the water vapour out of the atmosphere which reduces corrosion, which is a significant problem in data centres

This is the first time I've heard that corrosion is a problem in datacentres. I've never noticed any corrosion on any machine at home or at work. Does it have to do with the cooling of datacentres?


I think part of it is that even problems that are small/unnoticeable at the individual level become big when multiplied by a huge N.

For example, I don't think I've ever had a hard drive fail in ~30 years of computing (without being dropped, that is), yet look at the Backblaze reports:

https://www.backblaze.com/blog/hard-drive-benchmark-stats-20...
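The "huge N" point can be made concrete with a back-of-the-envelope sketch. The 2% annualized failure rate (AFR) used here is a made-up round figure for illustration, not Backblaze's actual number:

```python
# Sketch of the "huge N" effect: with a hypothetical 2% annualized
# failure rate, the chance of seeing at least one drive fail in a
# year grows quickly with fleet size.
afr = 0.02  # assumed AFR, for illustration only

for n in (1, 10, 100, 1000):
    p_at_least_one = 1 - (1 - afr) ** n
    print(f"{n:>5} drives: {p_at_least_one:.1%} chance of >=1 failure this year")
```

At one drive, a failure in a given year is a ~2% long shot; at a hundred drives it's an ~87% bet, and at Backblaze scale failures become daily routine.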


That's just unbelievable. To even survive the Deskstar era blows my mind. Not a single drive?


That's not all that surprising to me, I've had supremely good fortune with drives. I have a Maxtor Fireball that still "works" (though its SMART has failed.) My current workstation still has drives in it from the three prior workstations; spanning back 7 or 8 years now[1], humming along well outside their warrantied operation. The first disk failure(s) I witnessed came while I worked at a computer repair shop, which would be "a huge N" compared to the number of PCs I've personally owned. My first disk failure on my own personal equipment didn't happen until I built out a home storage array - again a rather large multiplier of number of disks compared to a traditional PC. Even then that failure was because a disk controller got fried, the drive was mechanically sound and the data was completely recoverable.

Admittedly my own experience is probably a bit skewed since I don't have a personal laptop. By the time I got a job that provided me w/ a laptop solid state drives were already quite mature. In my experience doing repair work: laptops were a huge source of our failed drive & data recovery operations.

[1]: I should note these drives all operate nearly 24/7, no five-nines on my personal equipment though ;-P


Argh, trigger alert: I bought four of those in a RAID 10.

They even had the audacity to reject my warranty claim.


They sent me back refurbs that didn't work, I just gave up.


Up until a couple years ago I could have said the same. I even have a couple deathstars on the shelf that were operational when retired.

In the last couple years I've had several HDDs fail starting with a 2TB Seagate (one of the "stars" in the Backblaze stats.) I've had others fail since, mostly multi-TB drives. I had one nearly ten year old 200GB drive begin to report remapped sectors. I've even had an SSD develop an unrecoverable error.


My dad managed to kill a 2-year-old Samsung SSD; it wouldn't even show up in the BIOS. Tech support was so incredulous they actually had an electrical engineer on their staff call him.

I didn't believe him either. I was like "Dad, SSDs are designed to fail in a way you can recover data. What SATA cable are you using? Is the port dead?"

Turns out it was literally the drive.


I recall hearing of similar failures early on. I don't recall the brand. Apparently the controller goes bad and it just stops responding to SATA commands.

Intel had a strategy that caused them to go into read only mode once they reached their rated lifetime write capacity. All data remained available until they were power cycled whereupon they would intentionally brick themselves.

In my case it was a Crucial M4. It still tries to operate but will fail the long SMART self-test.


Much like a poster above, I've got a walking graveyard of platter drives from various old workstations in my current one, and they're all fine. But my first experiment with an SSD (and I think mine was a Samsung too) ended in tears. It was likewise fine for about two years, then I had some weird trouble booting 3 or 4 times, and then it just disappeared completely.

I've been very wary of SSDs since then.


I have. Including things like the heads falling off and spinning around, grinding everything to junk.


That sickening click sound that a disk drive makes when it's on the verge of failure is ingrained in my head.

I tried putting one of those in the freezer once, and maybe it was a folk remedy, but it worked enough to boot and get my data off!


That must be one of the silliest hacks I've ever heard, right up there with oven-baking electronics to fix loose solder joints. I love it!


As a teen I managed to "rescue" thousands of pounds' worth of graphics cards listed on eBay as spares/repair with this method. I recall it working about half the time; then again, at least 3 in 10 cards from eBay listed as broken worked immediately when plugged in.


Did you have any "signals" you looked for on the listings to indicate they might just need an oven reflow, or did you just buy everything cheap and hope it worked?


Reminds me of the old Xbox 360 "red ring of death" fix where you could just wrap the box in towels and leave it running for an hour to overheat it, then reboot and it would work again.


Very recently I was able to get a bootlooping N5X to boot temporarily, long enough to recover pictures, by putting it into the freezer for a couple of hours. Some folks have (claimed to have) had long-term success from freezing bootlooping devices.


Not OP, but I've had the freezer trick work more times than not. It's a last-resort type of deal, but when there's nothing to lose, try it. Just make sure the drive is sealed up nicely.


Worked in HS for a local computer repair shop. One trick to improve success with the freezer method is to place the drive in a sealed plastic bag with some desiccant packs; placing a bare drive in the freezer can kill it due to condensation. Leave the drive in the bag for at least a few hours to give the desiccant time to work. Like others have said, this is a last-ditch effort: if you care enough to send the drive to a specialist, don't do it, since it can ruin a recoverable drive. But it does seem to work ~30% of the time (at least with deathstars), and I did have success on a clicking Seagate drive last year.


It works because of how hard drives work - the heads are extended by slightly heating the bar the heads are connected to. If you're head crashing, you need a little less length, hence freezer.


This isn't quite true, the heads are kept from crashing by air pressure and a crash does irreversible damage. The "clicking" sound is either the heads flicking back and forth as the drive tries to seek, or the central motor trying and failing to start platters spinning which are jammed on the bearing.

What freezing the drive may achieve is either un-sticking it from the bearings (see also the "bang drive on edge" technique), or lowering the thermal noise floor in the electronics enough for marginal components to make it through the boot sequence.

It's not recommended. https://www.gillware.com/blog/data-recovery/hard-drive-freez...


When I worked for a small computer place years and years ago, we'd sometimes "tap" a drive as it was spinning up if freezing it didn't work. Tapping it with a screwdriver handle rarely worked, but it was always a last-ditch effort to save having to send a drive out -- which most people wouldn't pay for anyway.


I did that with a drive in my computer fairly recently. Although I didn't so much tap it as shake it on startup. It did finally start spinning enough to ensure that I had the last few bits of data transferred to a new drive, though. :)


While they may make legitimate points in the article, it's hard to consider authors who make money on HDD recovery an unbiased source here.


Ah, interesting! Thanks for the correction. I was off the project before we put in the drives obviously. ;)


There was a type of Seagate drive long ago (I believe mine was a 2GB) that would stop working, apparently because the spindle got stuck. But if you shook it in a rotational manner, it would unstick! The first time I sent it for warranty service and of course they deleted all my data. But the next time it happened I learned about the trick.


I can confirm the freezer trick has worked for me too. That was back before multi-platter high-density drives, though. I tried it recently with a failed 6TB drive from a friend, and it didn't work.


Yeah, this was many years ago. I think the one I did it with was a 250GB drive from the mid-to-late 2000s.


It definitely works. I just did it a couple of years ago.

In fact, no matter how long I left it in the freezer, I couldn't get all the data off in one go. I ended up freezing it, then putting it in a small cooler with a couple of ice packs, with the wires hanging out between the cooler and the lid. That gave me just enough time to grab all the data (well, really, a couple of not-so-small VHDX files).

It was much fun, like a science experiment in high school.


You may be interested in this archive of failing-hard-drive sounds. Note: as it's about 10 years old, it requires Flash. http://datacent.com/hard_drive_sounds.php


Well, you're exactly part of that huge N: people who post on the internet and who, combined, have owned an even larger number of drives over the years.


Even managing just a few hundred nodes, I see hard drive failures pretty routinely. Servers hammering on storage devices 24x7 are much different from desktop computers. I've never had a hard drive fail on my own desktop machine, though I know people who have.


That backblaze report is absolutely fascinating! I'm curious though what the statistics would be from a more consumer point of view, with regards to quantity of use, operational temperature and so forth. I imagine most consumer drives sit idle almost all of the time, whereas theirs are more likely in constant use.

That said, that paints a pretty handy real world picture of what one might expect.


I killed a hard drive when I swapped motherboards about 15 years ago. I suspect static electricity.


My guess is that you change disks often. I've had multiple failures over the years, either from overheating or, more often, from heavy use over extended periods of time (i.e. more than five years). Although I've never had a failure before the warranty expired.


Yesterday I lost a 3TB Toshiba TD01*-series drive after 23.75 months of (home) use. Only 4800 hours powered on. It started to seek-fail, and SMART even showed it. It failed one week before the warranty ended. Ha!


I've seen a whole batch of hard drives fail: one-by-one the company's employees' PCs failed over a period of a week or two. They had all been bought together in a single upgrade.

I (on a Mac) was unaffected (and no doubt a little smug).


The only hard drive I've broken was because I let a pretty powerful magnet fall on the computer, right on the spot where the HDD is. Never play with magnets next to your computer.


My N is just 10, yet I've had two failures :(


A trend in energy-efficient data centers is to maintain temperatures and humidity at near the maximum acceptable levels to reduce energy use for cooling and dehumidification.

Data centers used to be like walk-in refrigerators. Now the air is borderline uncomfortably warm and slightly "heavy" with humidity.


Exactly. I know one that only does free cooling (they just have big fans; that's it).

Failure rate is a bit higher, but they are confident that their architecture is HA/redundant, and from what they told me, it's cheaper to replace hardware (they got some Dell "fresh-air" servers) than to cool the DC.


Sounds like underground in the desert would be suitable?


The industrial scale cooling used in datacentres can produce weird results, like when it rained inside Facebook's first datacentre: https://www.theregister.co.uk/2013/06/08/facebook_cloud_vers...


Sounds like they didn't know about this building [1]. That happened ~50 years ago.

1: https://roadtrippers.com/stories/boeing-factory


I wondered about this as well : https://www.researchgate.net/profile/Michael_Schappert/publi...

Seems like it's a combination of how data centres are cooled (flowing air) and the quality of the air being circulated: pollutants in the air pose a danger to the circuitry.

Never occurred to me this was yet another thing to be wary of when running a data centre!


I had a PC I built and moved to a bungalow near the beach. Within the year its power supply had rusted. My previous PC, in a relatively arid environment, lasted 4 years.

It's probably the "sea" environment that would pose corrosion issues.


Power supplies are usually plated or coated recycled "chinesium" steel. This is because it's incredibly cheap and has a half decent EMC outcome for the price. One little nick in the coating or plating and that's it though. Really they could use other materials such as cast aluminium, brass or copper but they are orders of magnitude more expensive.

If you look at some of the older electronic test equipment which was made with much higher standard materials it's not uncommon to find something that has been in a damp shed for three decades and powers up just fine after the dead spiders have been removed and the mould cleaned off. BUT at the time of manufacture they cost more than a mid-range car.


In an interview on Radio Scotland, the project lead said the container is sealed with an inert gas replacing the air and with a significantly lower water vapour content than the atmosphere.


There is a stark difference depending on which sea you are located at. I used to live close to the Atlantic ocean, near the equator. The failure rate of electronic components was through the roof. Random memory failures (due to contaminants in memory contacts) as well as corrosion in copper traces in the PCBs. At least one instance of motherboard fire.

I'm now close to the Pacific. No issues whatsoever.


Yes. The air quality -- especially the amount of sulfur in the air -- contributes to sulfur creep corrosion. Run that through an image search to see the results. It's pretty dramatic. Some of this equipment looks like it had been underwater.

But while the older equipment was pretty stable (so long as the capacitors didn't bulge), today's equipment usually has to comply with RoHS. Lead, for all its faults, has a very well-understood corrosion mechanism. The exotic blends used to replace it, we're discovering, aren't always so great...

You know, now that I think about it, I wonder how many cell phone warranty claims denied for water ingress were actually due to poor solder choices? Color-changing stickers aside, of course.


Here's an interesting piece:

http://computer.expressbpd.com/news/ctrls-shows-how-data-cen...

tldr: gaseous contaminants corrode silver/copper components of circuitry


I think it's mainly a problem in highly polluted urban areas because of sulphur dioxide in the air. Most datacenters use enough air conditioning for moisture to be a non-issue.


Some AWS techs in Oregon from a few years ago have a fun story for you.


Maybe it is related to mitigation of sea water facilitated corrosion?


I think it has to do with the humidity of the air.


At what point does this become a problem? They're using the naturally cool sea water to regulate the temperature in the container, but that heat needs to go somewhere... it heats the water. If this becomes a popular alternative to classic server farms, at what point does it start to increase the temperature of the sea? I know, the oceans are big, but so is our data.


> I know, the oceans are big, but so is our data.

Meh. The biggest data center (ChinaTel's Inner Mongolia Information Park) consumes about 150MWe, assuming all of it becomes waste heat it's basically nothing: a nuclear reactor (each plant has 2~8) releases 2Wt for each We it produces, and they're usually sized between 800 and 1300MWe.

Hell, Earth averages 160W/m^2 from the sun and oceans cover 360 million km^2, so the oceans get on the order of 10^16 watts from the sun.

Not to say that it can't have a significant local effect: nuclear plants are strongly regulated to avoid heating their rivers too much (especially in warm summers), and again they're dumping orders of magnitude more waste heat into a very finite (though moving) amount of water.
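A quick sanity check of the comparison above, using only the figures already given in this comment (150MWe for the biggest data centre, 160W/m^2 average solar flux, 360 million km^2 of ocean):

```python
# Rough cross-check: data-centre waste heat vs solar input to the oceans.
dc_waste_heat = 150e6        # W: biggest data centre, all consumption as heat
solar_flux = 160             # W/m^2: average solar input, as cited above
ocean_area = 360e6 * 1e6     # m^2 (360 million km^2)

solar_input = solar_flux * ocean_area    # ~5.8e16 W, i.e. on the order of 10^16 W
ratio = dc_waste_heat / solar_input      # ~2.6e-9

print(f"solar input to oceans: {solar_input:.1e} W")
print(f"biggest data centre as a fraction of that: {ratio:.1e}")
```

So globally the largest data centre's entire draw is a few parts per billion of what the sun already delivers to the oceans; the local-effect caveat above is the real question.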


Even Greenpeace doesn't seem to think it's a bad idea:

"Experimental underwater data centres could be more sustainable if connected to offshore wind power, but Microsoft must focus more on investing in new renewable energy now. [...]"

https://www.bbc.com/news/technology-35472189


My understanding is that MS took that to heart on this second round. The article claimed they chose Orkney as the site of the new datacenter precisely because it has an excess of renewable energy.


That's kind of a clever idea to promote renewable energy.


Coupling an undersea datacenter with an offshore windfarm just sounds all-around clever.


Until a hurricane destroys it.


Build all above-surface structures out of inflatables. During heavy weather deflate and retract (5-10m). Use simple solid-state generators, e.g. https://en.wikipedia.org/wiki/Vaneless_ion_wind_generator


Are other structures safer against hurricanes?


An underground data center would work if hurricane safety is your top concern. In any case a location on land is probably cheaper to rebuild.


I think an eventual hurricane would not destroy the data center, just the wind turbines.


What about earthquakes then?


>Meh. The biggest data center (ChinaTel's Inner Mongolia Information Park) consumes about 150MWe, assuming all of it becomes waste heat it's basically nothing: a nuclear reactor (each plant has 2~8) releases 2Wt for each We it produces, and they're usually sized between 800 and 1300MWe.

Seems a shame to let all that heat go to waste. Surely we could put that waste heat to use in a distillery or desalinization plant or something.


Desalination isn't a bad idea, but it does produce large quantities of waste brine, among other things: https://en.wikipedia.org/wiki/Desalination#Environmental


>but it does produce large quantities of waste brine, among other things

So combination data center, desalinization plant, and pickle factory then?


To let all that radioactive heat go to waste? Although maybe that's just the primary coolant circuit - is the water they use from rivers/the ocean made radioactive, or just heated?


I was thinking more from data centers. You could probably do it with nuclear too.

This gif: https://www.ucsusa.org/sites/default/files/images/2015/08/np... shows how typical nuclear reactors work. They're functionally just steam turbines, the nuclear power is just capable of heating a whole lot of water.

The waste heat, though, doesn't come in contact with the radioactive parts. That's what happens at the bottom with the condenser.


You say "assuming all of it becomes waste heat", but doesn't all of the energy consumed necessarily become waste heat no matter what?


In addition, the heat represents extra energy. Warmth and water (and carbon) are the very fabric of life. Nature will adapt to utilize this free energy source before you could say "thank you".

My prediction is we'll be amazed at the life forms that develop and explode around such submerged cooling structures.


> Warmth and water (and carbon) are the very fabric of life. Nature will adapt to utilize this free energy source before you could say "thank you"

Ocean warming is extremely detrimental to ecosystems across the globe. You can't just simplify it to more heat == more energy == better

>In other non-tropical regions of the world, marine heat waves have been responsible for widespread loss of habitat-forming species, their distribution and the structure of their communities. This has a tremendous economic impact on seafood industries that have to deal with the decline of essential fishery species.

>It is likely that over the past century, the impacts on ecological chains have been more frequent as ocean temperatures have escalated decade after decade. This is the case in Nova Scotia, where kelp forests are literally being wiped-out by water which is much warmer than usual. In this corner of Canada, the ocean is not just a form of recreation, it also means a way of life for many that rely on fisheries and aquaculture as an important part of their economy.[1]

[1]https://www.theweathernetwork.com/news/articles/ocean-heat-w...


> In addition, the heat represents extra energy. Warmth and water (and carbon) are the very fabric of life.

That's cute, but 1. nature will adapt just fine to both heat and cold so that's not exactly compelling; 2. the issue is we may not, human civilisations have arisen in fairly specific conditions, and tend not to be very resilient to significant environmental changes

> My prediction is we'll be amazed at the life forms that develop and explode around such submerged cooling structures.

My prediction is we won't live long enough to see that happen, it takes kilo- to mega-years for anything more complex than bacterial mats to evolve to use new sources of anything.


> nature will adapt just fine to both heat and cold

Not with the same ease! That's the point — it is a one-way street. "Having easy access to energy" or "not having easy access to energy" are NOT equivalent states for flourishment. They're not equally "just fine".

The rest seems like you're grinding some anthropomorphic axe unrelated to my post, so I'll abstain.


> "Having easy access to energy"

Life needs an energy gradient. In this case, direct access to colder water. No organism (or machine) can use the heat energy of its environment if it has no access to a colder medium.

Edit: I just saw that Retric explained it better (https://news.ycombinator.com/item?id=17245948).


Where is this confusion coming from? Are you suggesting there won't be colder water around the data centre? Or just nitpicking for the sake of it?

It looks like my original comment hit some HN ideological hot spot (unintentionally), but it's entirely uncontroversial scientifically.


> Where is this confusion coming from? Are you suggesting there won't be colder water around the data centre?

If a fish is swimming in 24°C water, it can't simultaneously be swimming in 19°C water. Maybe its friend 10 meters away is swimming in 19°C water, but that doesn't help the first fish.

Maybe that fish feels more comfortable in 24°C water, because it needs a certain body temperature to keep its internal processes running (i.e., to not freeze to death), but it cannot harvest energy from the 24°C water, which is what you claimed above. I'm not nitpicking, this is one of the most fundamental and important laws of physics.


I don't know whether this is feasible at the temperature gradients created by data centers, but it's not prohibited by physics. If there are chemical species in the water that are stable at 24 C and not 19 C, the fish can harvest these for chemical energy after they have traveled the ten meters.

This is probably more relevant at temperature gradients greater than 5C, but it's thermodynamically possible.
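To put a number on how little of a 24°C/19°C gradient is harvestable even in principle: no engine (biological or otherwise) can beat the Carnot limit between those two temperatures, which comes out under 2%. A small sketch, using the temperatures assumed in this thread:

```python
# Carnot limit for an ideal heat engine working between
# 24 degC water (hot reservoir) and 19 degC water (cold reservoir).
T_hot = 273.15 + 24   # K
T_cold = 273.15 + 19  # K

eta_max = 1 - T_cold / T_hot
print(f"maximum theoretical efficiency: {eta_max:.2%}")  # ~1.68%
```

And that's the theoretical ceiling; any real organism or machine extracting work across a 5°C gradient would do far worse.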


Nature, defined as the absence of human meddling, will be fine. Especially after humanity went extinct.

In this context, we're not worried about the literal definition of natural but about keeping an environment in a state in which humanity can survive.


Heat is not really useful energy in the way you're talking about.

https://en.m.wikipedia.org/wiki/Second_law_of_thermodynamics

Life wants low-entropy energy like sunlight or glucose because it can do something with it. Heat can speed up chemical reactions, which makes a minimal amount useful indirectly, but not as an energy source.

PS: Heat gradients, unlike uniform heat, represent usable free energy; for example, they create thermals, which are then useful.


You're correct of course, but same difference here. The extra energy will kickstart processes that nurture life.

I was actually pondering whether to expand on "energy", "water", "carbon" and "life" in the original post (none of them trivial concepts) but decided against it. It'd only muddy the waters, so to speak, missing the point:

A submerged data centre will be a net boon to the biological life around it.


> The extra energy will kickstart processes that nurture life.

Why would that be the case? If this were the case, the biological density of the highest-temperature locations in the ocean should be significantly higher than average. This has not been observed. Quite the opposite, actually.

Secondly, in thermodynamic terms, heat is the least useful form of energy. Heat is often the waste product of a chemical/mechanical process and cannot be easily be converted into other forms of energy without significant loss.

And no offense, but water and carbon are both trivial, well-defined concepts. Energy is also fairly well defined.


Local pockets of increased heat have not been observed to nurture life? Thermals mentioned by OP are just one obvious example.

I agree making use of subtle energy gradients is not trivial, but life is pretty good at it nevertheless. Even in conditions you wouldn't expect it. And no surprise — it had billions of years to evolve that way.

If you wanted to be daring, you could even say that's what life is for.


> Local pockets of increased heat have not been observed to nurture life?

That's absolutely not what the comment you're replying to says. What it says is:

> If this were the case, the biological density of the highest-temperature locations in the ocean should be significantly higher than average.

> Thermals mentioned by OP are just one obvious example.

Thermals are not just heat, and by and large the heat is not a source of energy (sulphur chemistry is the basal energy source of thermals). And shallow waters have much higher biological densities.

Ambient heat is only useful so far as helping the organism improve the efficiency of its chemical and biological reactions, it's extremely rare for it to be an actual energy source (because as you've been told multiple times it's extremely hard to use/harvest). And organisms are generally adapted to a certain level of ambient heat with compensatory mechanisms matching, most don't do very well if you drastically change their ambient heat levels, again aside from micro-organisms with short lifecycle which can adapt extremely quickly.


Listen, my claim is simple: submerged data centres will give rise to richer, more complex ecosystems around them (due to the extra energy, heat convection, increased entropy etc), in a surprisingly short timeframe. Because that's what life's good at.

You disagree, giving reasons I find irrelevant here (a data centre won't make a dent in the average ocean temperature, and certainly won't make it "the highest-temperature location in the ocean"), but I respect that. The good news is that the impact will be easy to evaluate once deployed.

In fact, testing the data centre's impact on the surrounding ecosystem will surely be a mandatory component of any such project, so we'll get to see the hard data. Let evidence be the judge of the "absolutely nots".


Ask yourself why you're on this site.

Is it to steadfastly argue positions far outside your domain expertise or engage in discourse and learn?


Well you're making it sound as if the data centers being cooled somehow wouldn't happen otherwise. Seeing as how the alternative is to cool using A/C, surely it's the case of causing less harm than the alternative?

Of course you could argue that the lower cost of this cooling method will create a larger demand for cheaper servers/data storage, which increases net harm, for which I have no answer.


> Of course you could argue that the lower cost of this cooling method will create a larger demand for cheaper servers/data storage, which increases net harm, for which I have no answer.

That argument doesn't apply here: demand isn't currently being constrained by us being afraid of the environmental impact of datacenters.


Sure it does. If the costs of these datacenter operations were extremely high (say, by a factor of 100), you bet we'd have lower bitrate for Youtube, fewer GB allowances on Office 365, Dropbox, Gmail, etc.

Conversely, if it became literally free to operate, you can imagine we'd have a higher demand for data, storage, etc.

It's like how center pivot irrigation reduced the water usage rate per acre but increased demand for water usage in various areas because costs were lowered as well.


edit: nevermind I can't read


You don't believe demand isn't being constrained in the way I described? So you believe that people are looking at cats on facebook, and stop because they're worried about the environmental impact of facebook's datacenters? You're joking...


FWIW, your local FB datacenter is likely powered by 100% renewable energy :)


There'll also be a lot less energy consumed in the cooling process - A/C is probably grossly inefficient compared to moving water around.


The overall heat increase is the same, but the localized effects might not be: fewer beings live in the atmosphere, and air disperses that heat faster than water, reducing the local concentration.


>The overall heat increase is the same

That's not the case if you're comparing with AC. You have to use energy (generating more waste heat) to pump heat up a temperature gradient.


Wait, is that true? Fewer organisms in air versus water? Is that due to the huge amount of e.g. zooplankton maybe? Curious...


>which increases net harm, for which I have no answer.

Find a desert under the sea and drop them all there. Some place that's basically water and dirt. It's probably going to be large enough that you could dump all of humanity's data centers there for the next 100 years and still have room to spare.


If my calculation is correct, the total annual electricity production in the world (I used 20 279 PWh) would heat up the oceans (1 347 000 000 km³ of water) by 0,0129°C.

It's rather easy to calculate because one calorie is the energy required to heat up one gramme of water by one degree Celsius, the rest is just unit conversions, but I could have messed up anywhere of course.

Even if I'm off by a few orders of magnitude I'm confident that datacenters won't be noticeably heating up the sea for a long time.
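For anyone who wants to check the arithmetic, here is the same back-of-the-napkin calculation spelled out, taking the figures as stated in this comment at face value:

```python
# Sanity check of the calculation above, using the stated inputs.
energy_wh = 20_279e15                    # 20 279 PWh, as stated, in Wh
energy_j = energy_wh * 3600              # Wh -> J
ocean_volume_m3 = 1.347e9 * 1e9          # 1 347 000 000 km^3 -> m^3
ocean_mass_kg = ocean_volume_m3 * 1000   # ~1000 kg per m^3 of water
c_water = 4186                           # J/(kg*K), specific heat of water

delta_t = energy_j / (ocean_mass_kg * c_water)
print(f"temperature rise: {delta_t:.4f} K")  # ~0.0129 K
```

The unit conversions check out against the figure given; whether the electricity input itself is right is a separate question.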


> Even if I'm off by a few orders of magnitude

I'm not saying there is a problem with your calculations, but if you were just one order of magnitude off, then after 10 years that would be 1°C, which is a colossal amount. The main problem with global warming, as I understand it, is that 1 or 2 degrees change to the atmosphere would melt a huge amount of polar ice. I imagine that if the sea were increase in temperature by that amount, given that water has better thermal conductivity than air and that's kinda where the polar ice mostly is, the effects would be at least as bad (?)

Edit: given that some of the heat would dissipate into the atmosphere and sea bed, maybe it would need to be more than one order of magnitude higher to have this effect.


> I'm not saying there is a problem with your calculations, but if you were just one order of magnitude off, then after 10 years that would be 1°C, which is a colossal amount.

That is assuming that all the electricity produced on Earth is used to heat the oceans, which is not realistic. Oceanic datacenters are not likely to ever amount to more than a few percent (and even that is unlikely) of the total human electric consumption.

For reference, apparently datacenters used 416 TWh in 2015, which is 0,002% of the total electricity usage.

That's why I'm confident that there is a lot of margin in my calculation.


> all the electricity produced on Earth

Oops, I misread your comment, sorry. I thought you were talking about all electricity used in data centres worldwide.


> then after 10 years that would be 1°C,

or 1000 years if off in the other direction


You can't assume that the ocean is perfectly mixed. For example, desalination plants dump highly saline water into the ocean that can have adverse effects on local ecosystems, even though their effect on the entire ocean may be negligible.


According to https://en.wikipedia.org/wiki/List_of_countries_by_electrici... , total electricity production is about 24816400 GWh/year (~25 PWh/year, so you were 3 orders of magnitude off).

  $ units -t '24816400 GWh / (1347000000 km^3) / (1 kg/dm^3) / (4200 J/kg/K)'
  1.5908368e-05 K


Units is a wonderful program; don't be afraid to use its definitions (/usr/share/units/definitions.units):

  $ units -t '24816400 GWh / (1347000000 km^3) / waterdensity / water_specificheat'
  1.5851925e-05 K
There's even definitions like `oceanarea`, etc ;-)


And if there are 7.62 gigapersons on Earth, that's about 3.2 MWh/year per person, or 271 kWh/month per person.

Since that's the same order of magnitude as, say, a US household, that does seem credible.
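That sanity check as a sketch, using the ~24,816,400 GWh/year figure quoted upthread:

```python
# Per-person electricity: does ~25 PWh/year world production pass a smell test?
world_production_wh = 24_816_400e9    # GWh -> Wh, figure from the comment above
population = 7.62e9

per_person_kwh_year = world_production_wh / population / 1000
per_person_kwh_month = per_person_kwh_year / 12
print(round(per_person_kwh_month))    # ~271 kWh/month, household-scale
```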


So that is to say, all the worlds energy production directed toward heating the oceans, could increase global oceanic temperature by 0.00001 degree C in a year.


Looking again, I think I got caught out by the "." being used as a decimal separator in some places.


One thing to note is that heat diffuses much slower through the ocean than the atmosphere - even if the impact on the average temperature of the ocean is negligible, local thermal pollution could still cause major issues. I don't know how much effect a pod of this size will have; hopefully Microsoft will fund and publish a comprehensive review.


I understand that this is just a back-of-the-napkin calculation, but I feel like the heating will have a more noticeable impact on the local environment (think 5km³), rather than considering all of our planet's oceans all at once.


> I understand that this is just a back-of-the-napkin calculation, but I feel like the heating will have a more noticeable impact on the local environment (think 5km³), rather than considering all of our planet's oceans all at once.

You're not putting all of the planet's datacenters within 5km^3 either. The calculation is fine.


You are indeed off by 1000 (assuming your inputs are correct)


I'm not surprised! Which way though?


Lower. You probably mixed kilograms and grams.


Now you can calculate by how much the water level would rise and how many square feet of land would be flooded...


However, will these values change depending upon the depth, and hence the greater pressure?


>(I used 20 279 PWh) would heat up the oceans (1 347 000 000 km³ of water)

Your calculation is stupid because you don't need to heat up the whole ocean; heating up small areas will have a butterfly effect, enough to badly affect bigger areas. Look at this:

>As the concentration of carbon dioxide in the atmosphere rises due to emissions of fossil fuels, more of the gas is dissolving in the ocean. That is lowering the pH of the water, or making it more acidic, which can make it harder for reef-building organisms to construct their hard skeletons.

A minor change in CO2 changed the pH of the water, which kills organisms over a wide area.

https://news.nationalgeographic.com/2016/03/160321-coral-ble...

There are also enclosed seas like the Baltic that need over 100 years to fully mix their water with ocean water. The Baltic is much less salty than other seas and warmer, and also much more polluted by toxins that were sunk there during and after WW2.


>Your calculation is stupid because

No need to be rude. "There is an issue with your calculation because" would have worked :/


Not a subsea engineer, but having worked in the space previously I can tell you that we already have giant man-made heaters sitting at the bottom of the ocean. Subsea oil and gas wells take advantage of the steady temperature at those depths to dump excess heat from the production fluid (sometimes up to 120F for straight crude, much higher for an oil and gas mix) before and while it rises up to the platform, sometimes using active or passive cooling on the ocean floor.

Then there's natural geothermal vents doing that, but probably an order of magnitude more, since before humans were around.

The ocean is a pretty big thermal mass.


This will never become a problem.

Take the Mediterranean for example, a relatively small sea. It has a volume of 3.75e15 m3, with a mass of 3.75e18 kg.

We need about 4 kJ/kg/K to heat up water. To heat the whole Mediterranean by 1K, we need an energy of 1.5e19 kJ, or 4.16e6 TWh.

In 2008, total worldwide primary energy consumption was 1.32e5 TWh, 31x less energy than is needed to heat the Mediterranean by 1K.
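The same numbers as a sketch (constants as stated in the comment: mass 3.75e18 kg, ~4 kJ/kg/K, 1.32e5 TWh of world use):

```python
# Energy needed to heat the Mediterranean by 1 K, vs. world energy use.
med_mass_kg = 3.75e18       # from the comment above
specific_heat = 4000        # J/(kg*K), the ~4 kJ/kg/K figure above

joules_per_kelvin = med_mass_kg * specific_heat   # ~1.5e22 J
twh_per_kelvin = joules_per_kelvin / 3.6e15       # 1 TWh = 3.6e15 J
world_use_twh = 1.32e5                            # 2008 primary energy, as above

print(f"{twh_per_kelvin:.2e} TWh per K")               # ~4.17e+06
print(f"{twh_per_kelvin / world_use_twh:.1f}x world")  # ~31.6x
```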


It cools back down when the water evaporates. But it doesn't do the oceans any good anyhow, because the heat also fuels algae blooms.

I doubt it will become popular because it involves waterproofing the enclosures, complicates maintenance staff access, and also carries the risk of water getting in and damaging the equipment.

There are other alternatives to AC. Using a cooling tower is a much better alternative to submerging the whole datacenter.


This is mostly aimed at large cloud providers such as Amazon, Google, and Microsoft. They can use units like this and take them up for scheduled maintenance every x years. What happens to some of the individual servers between maintenance is of less importance. The infrastructure can handle the failures and they can just turn off the affected servers if they cannot fulfill any role.


> Microsoft's Ben Cutler insists the warming effect will be minimal - "the water just metres downstream would get a few thousandths of a degree warmer at most" - and the overall environmental impact of the Orkney data centre will be positive.


I was wondering the same thing.

>> "Microsoft's Ben Cutler insists the warming effect will be minimal - "the water just metres downstream would get a few thousandths of a degree warmer at most" - and the overall environmental impact of the Orkney data centre will be positive."

As if we haven't heard that before, about carbon dioxide emissions and other environmental pollution. From past experience, companies mostly have interests other than the protection of our environment.

I'm very interested to see what scientists and researchers think of this.


Wouldn't it be much smarter to recycle that heat energy? "Our goal is to recover and reuse all the heat produced." - https://www.telia.fi/yrityksille/english/telia-helsinki-data...


We're already heating the sea (and everything else) by running the air conditioners to cool the datacentres, and generating the electricity to power the air conditioners that cool the datacentres.

This is a lot more efficient.

Perhaps there might be a localised effect, but I doubt the effects would be as severe as the carbon emissions saved.


All the big power plants use rivers for cooling. It doesn't seem to be a terribly big problem unless there is a drought.


Thermal pollution can cause significant ecological impact [1] when left unchecked; however, thermal discharge is regulated under the Clean Water Act [2], hence the usage of cooling ponds [3] and towers [4].

[1] https://en.m.wikipedia.org/wiki/Thermal_pollution

[2] https://www3.epa.gov/region1/npdes/merrimackstation/pdfs/ar/...

[3] https://en.m.wikipedia.org/wiki/Cooling_pond

[4] https://en.m.wikipedia.org/wiki/Cooling_tower


Or summer is a bit hotter than anticipated, in which case fish are free to go ahead and die.


I'm guessing it is less than a drop in the ocean, given that the ocean covers about 3/4 of the Earth. Also don't forget that the Earth's core is hot as hell. I don't have exact numbers, but my guess is it would be much less than a drop in the ocean if all datacenters were pushing their heat into the ocean.


> At what point does this become a problem? They're using the naturally cool sea water to regulate the temperature in the container, but that heat needs to go somewhere... it heats the water.

Makes as much sense as the people who think that windfarms will eventually stop the wind, or people who worry that smoking is contributing to global warming.


How much of a problem is it compared with all the lava flowing into the ocean from the Hawaiian volcano?


Time for outerspace data centers with ultrafast space elevator strato-cable connections.


Counterintuitively, cooling is really difficult in outer space. It's cold, but there's no air or water circulating to remove the waste heat by convection. Instead you have to rely on radiating the heat away, which is much less efficient.


I have the impression "cold" is kind of ambiguous in space, because while there are very few particles, the ones that are there are moving quickly, which could be seen as a high temperature.

See: https://en.wikipedia.org/wiki/Thermosphere


Never thought of that. Nice point!


If there's enough heat, couldn't we recycle it?


No, because you cannot convert heat alone into useful work. You need a heat difference (something hot and something cold), and the efficiency of the conversion depends on the temperature difference.


You could use it to heat something up that you actually want to be warm.


Right, but if you’re putting it in a place that is naturally cold, you’re going to have a temperature differential you can then work with to recycle the heat, wouldn’t you? Obviously you’re going to be losing heat still, but you could reclaim some of that energy.


Yes, in theory you could do that, but the efficiency depends on the temperature difference. The higher a temperature difference you have, the better. The theoretical maximum efficiency of a heat engine is (Tmax-Tmin)/Tmax (temperature in Kelvin). So a few degrees is not going to result in anything usable. So to reclaim the energy, we need a large temperature difference.

But since the goal is to cool the datacenter, we want the temperature difference to be as small as possible.

Reclaiming energy from waste heat usually only pays off in industrial settings. If you have very hot steam, you can use it to power a steam turbine and generate electricity. But at lower temperatures, the efficiency is too low.
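A sketch of that limit with illustrative temperatures (assumed, not from the article):

```python
# Carnot limit: eta = (T_hot - T_cold) / T_hot, temperatures in kelvin.
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    return (t_hot_k - t_cold_k) / t_hot_k

# Datacenter waste water a few degrees above the sea (assumed 20 C vs 10 C):
print(f"{carnot_efficiency(293.15, 283.15):.1%}")   # ~3.4%

# Industrial steam at 500 C rejected to 20 C ambient:
print(f"{carnot_efficiency(773.15, 293.15):.1%}")   # ~62.1%
```

Which is the whole point: a few degrees of difference caps you at single-digit efficiency before any real-world losses.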



> Prof Ian Bitterlin, a data centre consultant for nearly 30 years, is sceptical about the environmental impact of going underwater.

> "You just end up with a warmer sea and bigger fish," he says.

> And 90% of Europe's data centres are in big cities such as London and Madrid because that is where they are needed.

> Microsoft's Ben Cutler insists the warming effect will be minimal - "the water just metres downstream would get a few thousandths of a degree warmer at most" - and the overall environmental impact of the Orkney data centre will be positive.

Both of these opinions feel like exaggerations but I don't know enough about thermodynamics to know what the true answer is. I have a feeling that the Microsoft opinion is much closer to the truth, can anyone help me understand how I'd walk through the numbers?


Not an expert either, but this might be a good way to think about it: on land or at sea, the machines produce the same amount of heat. On land, in general, the environment around the machines will be much warmer than the desired temp. Cooling on land is in general an active affair, which has to do with generating and controlling the flow of a fluid and/or maybe some kind of thermal exchange. At sea, being plunged in cold water is potentially sufficient to reach the desired temp, and cooling becomes a passive affair. All in all you are trying to displace the same amount of heat, but through a more efficient medium. Correct me if I'm wrong.


Oh, I'm sure that being in a cold environment and letting it passively cool you is more efficient!

What I'm more unclear on is how small the impact is. It seems very likely to me that throwing the waste heat into the bottom of the ocean has less of an environmental impact than running A/C to suck out the heat and dump it into the air.

However, is the impact really so small that you wouldn't notice it just meters away? That's the part I'm unclear on.


I think so, yes. Water has many times greater thermal conductivity than air, so the heat will spread out across a larger area faster, meaning a very small (negligible) temperature increase at any given location. You wouldn't be able to detect it from a few metres away.


I'm trying to see how the efficiency is gained and I just don't really see it. I guess you don't need to pump water for water cooling, so you save the pumping energy, or maybe lower fan speeds or something, but it really seems like it would be a minimal amount. What it does seem like is a cheap way to build a water-cooling system that doesn't really need to be maintained. It seems like switching to ARM would be a much bigger efficiency gain, but nobody cares because energy is cheap.


Switching to ARM might be a bigger efficiency gain, but nobody wants ARM. Azure already offers ARM servers. Moreover, these aren't exclusive - one can sink an ARM server just as easily as an Intel one.

But the efficiency gains are these: air conditioning is actually inefficient, especially with air cooling. Water cooling is more efficient, but it traditionally uses water only as an exchange medium and is eventually water-cooled-by-air anyway, just in bigger batches. Here, they can take in new cold water and throw out old hot water without actually bothering with any air exchange at all. Or at least, that's the plan. Maybe it'll work!

(If "nobody cares because energy is cheap", we wouldn't have tried this in the first place.)


Usually DCs use AC for cooling, which needs electric power, which needs far-away power plants and transmission lines. Here, you have a practically endless source of cool water that you can just pump directly.


> It seems like switching to arm would be a much bigger efficiency gain

That's a false dichotomy. More energy-efficient CPUs don't preclude efficiencies from cooling.

I suppose the case could be made that CPUs account for the vast majority of datacenter cooling needs and that ARM efficiencies would eliminate so much of that need that any efficiences in the cooling itself would not be worth anywhere near this kind of cost. I'd expect some pretty extraordinary evidence backing up that argument, since those would be pretty extraordinary claims.


I don't understand why they don't locate it in a city in a cold climate and use the waste heat to heat the city's buildings.

There is precedent. In Seattle there's the old "steam plant" which piped steam to many local buildings to heat them in winter.


This may be more common than is often realized. Also in Seattle, we (Amazon) use heat from the Westin colo/datacenter to heat some of our office buildings ... https://www.greenmatters.com/news/2017/11/22/Z2tmzwQ/amazon-...

About ten years ago, before working at Amazon, my last job involved building a datacenter in Leiden, the Netherlands. There the city has a municipal heat exchange program and we could also vent the excess heat to be used for heating water.

Modern data centers, especially for Cloud services, are really really really big though ... so big that they have specialized real estate and power requirements. The locations where you can get that much power, and that much space, tend to be outer sub-urban or quite remote. In those locations, there are few consumers for excess heat so more effort goes into reclaiming the energy loss through other means.


You're talking about District Heating[1], and that's been around for decades in hundreds of cities. It's the source of the steaming manhole covers that are a fixture of noir imagery.

Denver's got the oldest continually operating system in the world, and within the last decade or so, they added a cooling loop, as well. Instead of a boiler and a cooling tower, you can subscribe to a steam loop and a chiller loop.

The problem is once again one of gradient. It makes thermodynamic sense to pipe steam around, because of the large gradient between steam and ambient. But servers don't make steam. At best they make warm water only a couple of dozen degrees over ambient... and warm water doesn't have enough energy to heat buildings very effectively.

That said, and as someone already mentioned, people are doing it anyway. Seattle's internet exchange pipes its water across the street to the Amazon towers[2].

BTW, Seattle's Georgetown steam plant might not be making steam any more, but the one down by the market is still operating as Emwave Seattle[3].

[1] https://en.wikipedia.org/wiki/District_heating

[2] https://blog.aboutamazon.com/sustainability/the-super-effici...

[3] https://en.wikipedia.org/wiki/Seattle_Steam_Company


> At best they make warm water only a couple of dozen degrees over ambient

Consider that geothermal heating systems are based on the ground having a temperature of 55 degrees. The difference between that and cooler ambient is used to drastically reduce heating bills.

For example, if it is 30 degrees outside, that is heated to 55 by the earth, then the building heater only has to boost the 55 to 70 rather than 30 to 70.
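The comment's arithmetic as a sketch (temperatures as given; assuming, crudely, that heating energy scales with the temperature lift):

```python
# If the earth pre-warms intake from 30 F to 55 F, the heater only lifts
# 55 -> 70 F instead of 30 -> 70 F.
outdoor_f, ground_f, indoor_f = 30.0, 55.0, 70.0

lift_without = indoor_f - outdoor_f   # 40 F of lift without geothermal
lift_with = indoor_f - ground_f       # 15 F of lift with it
print(lift_with / lift_without)       # 0.375 -> roughly 62% less heating work
```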


City heat requires building out quite a bit of infrastructure, then in summer you still have to get rid of it without pumping it through the buildings. Even Inverness doesn't really need heat in summer.


Hi Walter.

Also, I'm assuming it has to do with distribution and reliability within regions, which is why you even need datacenters at different locations and not just one location.


I wonder why they are submerging these instead of just making them a floating barge.

I can't imagine submersion is significantly better for cooling; the energy density very likely already requires a circulating water system, and the added issues of dealing with the pressure at depth seem risky.

Possibly there is a benefit not related to cooling. Perhaps weather/waves? Things are probably a lot calmer 30 ft below the surface. The tossing and turning on the surface may place additional stresses on things like HDD spindles.


HDD spindles????


Spinning hard drives have a decent gyroscopic effect. Hold one while it's powered up and try rotating it in a manner that would change the plane of the platters, and you will feel the resistance.

Each time you do that, it puts a slight load on the spindle bearings. Do it repeatedly on a varying 5-30 second cycle and you'll simulate what a hard drive on a boat or barge in open water would experience.

I can imagine that would create additional wear and tear and contribute to an increased failure rate.


Right but who is deciding to use spinning hard drives instead of SSDs in an underwater data center?


Cooling fan spindles are subject to the same forces; and they had a large array of cooling fans.


I doubt they're going to use fans if they are pumping the air out of the thing.


Here's some more information from the blog post: https://news.microsoft.com/features/under-the-sea-microsoft-...

And here's the project site: http://natick.research.microsoft.com/


Wow. That is super interesting. Thanks for sharing.


I wish somebody worked on using that unwanted heat, like they do in swimming pool / ice rink combos, rather than finding ways to just disperse it faster in the environment.


It's tough in hot climates. There's plenty of heat to go around. And there are lots of people who live in hot climates who, for latency reasons, need servers near them.

Thus there are kind of two cases to solve for. You can stick a data center in Northern Europe, and that heat might be valuable enough that your approach could be to try to reclaim it. If you stick a data center in Singapore, you'd better focus on generating less heat in the first place or finding better ways to get rid of it.


Even in Singapore the max sea temperature is 32C, which seems cool enough to cool a data-center.


Or perhaps heating a greenhouse in a cold climate where fresh food is expensive or impossible to import.


Arthur C. Clarke wrote a short story about an power generator based on the temperature difference of a long pipe reaching into the cold ocean depths. I think the story was The Deep Range.


Toronto cools some of the larger office buildings downtown by pulling in cold water from the bottom of Lake Ontario.

http://www.acciona.ca/projects/construction/port-and-hydraul...


my local swimming pool is heated by waste-heat of a nearby powerplant


I wonder if the water would be hot enough for a thermal bath/spa?


Some non-environmental reasons for wanting to do this:

http://www.alexwg.org/publications/PhysRevE_82-056104.pdf

> Optimal intermediate trading node locations for all pairs of 52 major securities exchanges, calculated using Eq. 9 as midpoints weighted by turnover velocity... While some nodes are in regions with dense fiber-optic networks, many others are in the ocean or other sparsely connected regions.


I don't think this is done for the sake of efficiency, but rather latency.

For example, if you had an underwater data center that sat in the middle of the Atlantic Ocean between New York and London, you could do some serious trading with that capability.


No, you could not. You would gain nothing over having a datacentre in London and a datacentre in New York and a low latency link between them.

You can note, for example, that there is no data centre half way between Chicago and New York, even though that area has cheap accessible real estate and billions of dollars have been spent on low latency communication links between the two.
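To put numbers on it, a quick sketch (assumed figures: ~5,570 km NYC-London great-circle distance, light in fiber at roughly 2e8 m/s):

```python
# One-way latency over an idealized fiber path, ignoring routing/equipment delays.
C_FIBER_M_S = 2e8             # ~2/3 of c, a typical figure for light in fiber
NYC_LONDON_M = 5_570e3        # assumed great-circle distance

direct_ms = NYC_LONDON_M / C_FIBER_M_S * 1e3
via_midpoint_ms = 2 * (NYC_LONDON_M / 2) / C_FIBER_M_S * 1e3
print(f"{direct_ms:.1f} ms direct, {via_midpoint_ms:.1f} ms via a midpoint node")
```

The path through a mid-ocean node is exactly as long as the direct path, so an intermediate box buys you nothing that a London box, a New York box, and a fast link between them don't already give you.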



For those confused by this like I am, the key part to look at is the "emulation" paragraph.


Even off the coast of large cities, like NYC or Boston. The real estate cost of having a data center in those regions is high, but it would be much cheaper to just hide your servers underwater, near where your users are.


This is a big reason. The cost of leasing out space is definitely more expensive than the cost of cooling these datacenters.

Why spend several million dollars in leasing space when you can drop a datacenter capsule off the coast with free cooling? Who cares about the cost of hardware.


The cost of connecting the thing via submarine cable to a suitable land point is likely to be a lot higher than you think.


Do you have a source to back this statement?


So you’re going to put servers in a public waterway and you don’t think the US Government is going to have a word with you about it?

You’re going to pay even if it doesn’t have a street address.


However, Orkney isn't in the middle of anywhere, it's some distance even from Telehouse London.


Closer to Norway than London!

Of course, Orkney was part of Norway before becoming part of Scotland in 1468.


TIL. Those pesky Vikings.

My cousin, from Indiana, is currently serving as a minister on Stronsay. The scenic photos he's sharing are certainly encouraging me to contemplate a visit.


My great grandfather was skipper of the Evangeline who died along with all the crew off of Stronsay in 1909:

https://www.youtube.com/watch?time_continue=27&v=0UVWUbJp3Ys

http://www.axfordsabode.org.uk/disaster/evangeli.pdf

Edit: (Can't work out if was great great grandfather or not! - but certainly some ancestor!)

Edit2: I'd definitely recommend visiting Orkney - it regularly gets evaluated as one of the nicest bits of the UK to live, the scenery is fantastic and it has some of the most amazing historical artifacts in Europe.


Thank you, that document makes for compelling reading.


I wonder what the algae/flora impact on heat exchange will be after a year or two. This thing will be warm and stationary, so it will become overgrown really fast.


I imagine the first step would be to use standard marine anti-foul paint on the exchangers. But that only works for a few months at best.

After that, I'm not so sure. Perhaps a slow, continuous movement of an exchange surface past a hard surface would do the trick. When this stuff is young, it's easily wiped away, but once it's there for a while you get real problems.


We could look at hydrothermal vents to see the effect of heat source underwater.


The heat from hydrothermal vents on its own wouldn't support ecosystems in deep water; the vents also feed out various useful chemicals and nutrients. The base of these food chains is creatures that can build biological carbon compounds using chemical energy from those chemicals instead of using light, a process called "chemosynthesis".

https://en.wikipedia.org/wiki/Chemosynthesis


Well everyone is talking about the energy usage, which is the point of the experiment.

But what happens after 5 years (the expected life of the datacenter)? Most of the computers in there will be worthless by then. Will they bring the datacenter up and reload it with new equipment? Will the cost-benefit favor re-equipping it, or just sinking a new one? If it's going to be cheaper to just drop a new one in, we will have the ocean floor littered with dead datacenters.

Haven't heard anything about this.


Even if the equipment is outdated and broken, the scrap might be worth something to someone.


I assume the shell and racks will still be worthwhile in 5 years


If you want maximum energy efficiency, build data centers in cold climates where the excess heat can be used. Yandex has a data center in the city of Mäntsälä, Finland, and it provides 50 percent of the town's heating needs.

An even more advanced concept is combined district heating and cooling, with data centers integrated into it. Data center heat can be used for heating in cold regions. Combined heat pump/chiller units can produce both heat and cooling at the same time. Sea water (through absorption chillers) is used for cooling apartments, offices and data centers. Heat from data centers, purified waste water, etc. is used for heating apartments. During winter more heat is utilized; during summer, more cooling.

https://www.theguardian.com/environment/2010/jul/20/helsinki...

Suvilahti Data Centre New Build Case Study: Operationalizing District Heating and Cooling in large Data Centre in Former Electricity Station http://www.energy-cities.eu/IMG/pdf/WS2_Helsinki.pdf

https://www.helen.fi/en/company/energy/energy-production/pow...

COMBINED DISTRICT HEATING AND COOLING https://www.euroheat.org/wp-content/uploads/2016/04/Case-stu...

District Heating & Cooling in Helsinki http://www.iea.org/media/workshops/2013/chp/markoriipinen.pd...


It's all very clever, but think about the cost and complexity. Installations, management, real estate, physical security etc.

I'm quite confident that even making optimal use of the excess heat you would end up with less money than with their solution. I don't think companies want maximum energy efficiency. They want maximum cost efficiency.


Exactly, cooling is expensive but not as much as human resources. With a sinkable data center, you could put it next to any coastal or riverside city.


The technology is already proven and saving money.

> physical security etc.

Physical security is not a problem. It's not like the city utilities manage the cooling of the data center. There are heat exchangers between. If there is problem with city utilities, the heated sea water goes back to the sea.


So flooding a server starts to mean something completely different.


What would be its ecological impact? What if there are more of these data centers?


Is there a benefit over just putting it in a boat/coastal building and using water (perhaps pumped from some depth) for cooling?


Boats in that part of the world are subject to severe stresses because of the weather. And boats are always moving, even in harbour. Just maintaining a reliable cable connection to a boat is more challenging than connecting to a subsea container. Placing the container on the sea floor avoids almost all of the weather related problems.


Microwave links? How about an oil derrick style thing mounted directly to the seabed so it doesn't move?

There's also plenty of coastline away from major cities. Find enough open space and it becomes feasibly multi-tenant. Add wind/solar and run a cable (or 3).


The closest thing I remember is Google taking over an old paper mill in Finland that used a condenser sunk into a canal connected to the Baltic sea.

https://www.wired.com/2012/01/google-finland/ there should be a video with the details here.


There's also a data center in Helsinki, Finland that is cooled by sea water and the waste heat produced is piped via a heat pump into the district heating network. Here's a press release from 2011: http://www.investineu.com/content/atos-builds-world%E2%80%99...


This, it feels like an awful waste to dump the heat straight into the ocean. District heating networks are a very efficient way to heat up a city, and they can be used as a place to dump excess heat. Further, we've now also got district cooling, http://basrec.net/wp-content/uploads/2014/05/District%20Cool... [PDF warning]


Putting it on/in a boat or near the coast might get you the seawater for cooling, but the project lists a few other big benefits, like an oxygen-free atmosphere, constant temperatures year-round, and ease of deployment.


A hermetically sealed capsule can be placed on a boat just as easily as a submarine, if not easier due to the pressure difference.

I'll give you the constant temperature but this project is about using the nearly-free ocean to maintain temperature so surely the delta isn't too large here.

I don't see how this is easier than a boat at anchor.


Imagine a future where, whenever the energy price drops in a region, companies show up with a bunch of these tubes and drop them in the ocean, do some batch processing, and once the energy price rises, pull them out and move on.

In reality of course connectivity would be a major problem and energy price differences are probably not large enough to make it viable.


Although it doesn't have anything to do with computational load, hydroelectric storage exploits the change in electricity price to store energy in hydroelectric systems. At low demand they pump water into the reservoir, and at peak demand they release it again through turbines, generate electricity, and sell it at the higher price.

https://en.wikipedia.org/wiki/Pumped-storage_hydroelectricit...
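A toy sketch of the arbitrage (all prices and the ~75% round-trip efficiency are assumed, illustrative numbers):

```python
# Buy cheap energy to pump uphill; sell what survives the round trip at peak.
energy_bought_mwh = 100.0
round_trip_eff = 0.75                 # assumed pumped-hydro round-trip efficiency
price_low, price_high = 20.0, 60.0    # $/MWh, assumed off-peak vs peak prices

cost = energy_bought_mwh * price_low
revenue = energy_bought_mwh * round_trip_eff * price_high
print(revenue - cost)                 # 2500.0: profitable despite the 25% loss
```

The spread between off-peak and peak prices has to outweigh the energy lost in the round trip, which is why these plants only run the cycle when the gap is wide enough.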


Powering down some facilities during temporary spikes would make sense (someday, if hardware is cheap enough, operating only during peak solar generation/minimum local demand could make sense). Being able to relocate over a few-month period due to cost/legal/etc. could make sense, as could the special tax and planning/regulatory treatment a "ship" would get vs. a building on land.


Putting your datacenter in international waters, without laws or regulations, hidden from the world. Such a great way to do things without people knowing, or having people do things about it.

Power it with offshore windmills, and you really only need a network connection.

Great idea though, using water to cool your datacenter and leaving all the oxygen out.


Maritime law: there is no such thing as international waters for “stuff”; each vessel has to fly under a flag.

The MSFT Datacenter is a vessel under law, if they don’t want to fly it under a flag they’ll have to essentially abandon it and then salvage laws come into play.


In that context, the Internet's fascination with pirates makes a whole new level of sense: servers classified as marine vessels, admins as literal captains sailing under flags.

In a more fun world, one of those vessels would fly the Jolly Roger and host a certain infamous Swedish bay.


If your data center isn't registered and flying a flag, won't it be salvage and belong to whoever recovers it?


Yeah but what are the odds of somebody finding it?


Just follow the cables!


Why would the physical location of the server matter with respect to people knowing what you are doing? I have no idea where Google's servers are when I fire off a search for cute cats in boxes, yet I still know what it's doing.

As for “without laws”: Google still follows European laws even if my search happens to be handled on an American or British machine. You obey the laws where you operate, not where your server is located.


For companies who are already complying with laws in whatever jurisdiction I'd have to agree with what you are saying.

But if the economics make sense on some level in the future, I could see criminal enterprises being interested in exploring things like this. If narcos are building and using submarines now, I could see them managing, tracking, and monitoring a global logistics operation from their sunken data centers (they'd need to find other ways to power them and to transmit/receive, because I doubt they could just hook their cables up directly onshore anywhere; setting aside that a cable is an easy choke point for cutting access).

Maybe future narcos, illegal EU personal-data miners/brokers, pirates, or whoever else could be interested, especially if the fixed cost of operating conventional infrastructure comes to exceed that of operating sunken data centers, and the probability/cost of seizure on land exceeds the probability/cost of seizure for a sunken data center?


Let's hope this takes off and leads to regulating the oceans.

For example: industrial, unregulated fishing destroys coral reefs every day. They are thousands of years old and won't come back.


> It will not be possible to repair the computers if they fail

This is the part that concerns me the most. Pretty much all DCs have multiple 24/7 staff to deal with hardware failures and equipment swaps... telling a client "you can't access your hardware for 5 years" wouldn't go over too well.


Well, you clearly wouldn't lease dedicated hardware with this model. You just redistribute load amongst data pods when there is a failure, and pull the whole pod from service for refurbishment once it passes your total failure threshold.


Think about it in a cloud model where you aren’t renting a server you know by name. Imagine running something like object storage or FaaS where the cloud provider handles everything behind the scenes and can failover at any time without you seeing more than perhaps a few failed requests.

In that model, hardware failing is just a factor in total overhead cost. If the hardware doesn’t fail immediately it might be cheaper to leave a dead node in the rack than to pay a human to touch it, especially if they’ve already recouped a significant percentage of the purchase cost by the time it fails. Over the life of a server the cost of cooling is enough that a substantial savings will push that breakeven point earlier.
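That break-even reasoning can be made concrete with a toy model. The node value, lifetime, and repair cost below are hypothetical numbers, not anyone's real economics:

```python
# Toy fail-in-place model: repair a dead node only if the value still to be
# recouped over its remaining life exceeds the cost of touching it.
# All dollar figures and lifetimes are hypothetical.

def residual_value(node_value, months_deployed, lifetime_months):
    """Straight-line value left in a node if it stays dead in the rack."""
    remaining = max(lifetime_months - months_deployed, 0)
    return node_value * remaining / lifetime_months

def should_repair(node_value, months_deployed, lifetime_months, repair_cost):
    """Repair only when the unrecouped value exceeds the repair cost."""
    return residual_value(node_value, months_deployed, lifetime_months) > repair_cost

# A $3,000 node on a 5-year schedule: worth fixing at month 6, not at month 55.
early = should_repair(3000, months_deployed=6, lifetime_months=60, repair_cost=500)
late = should_repair(3000, months_deployed=55, lifetime_months=60, repair_cost=500)
```

Anything that raises the cost of a human touch (a sealed underwater vessel being the extreme case) pushes the "just leave it dead" threshold earlier in the node's life.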


> This is a tiny data centre compared with the giant sheds that now store so much of the world's information, just 12 racks of servers but with enough room to store five million movies.

How much is that in Library of Congress units? This trend of not giving out data but appearing to do so is strange.


"The Phase 2 datacenter can house 27.6 petabytes of data."

Assuming you're using 15TB as one LoC unit, then about 1,800 LoCs.
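The arithmetic behind that figure, using the (informal) 15 TB-per-LoC convention from the comment:

```python
# Convert the quoted Phase 2 capacity into "Library of Congress" units,
# assuming the informal convention of 15 TB per LoC.
capacity_tb = 27.6 * 1000   # 27.6 PB expressed in TB
loc_unit_tb = 15            # assumed size of one LoC

locs = capacity_tb / loc_unit_tb
print(round(locs))  # 1840, i.e. "about 1,800 LoCs"
```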


Interesting idea but the maintenance must be painful. Let's say a hard disk fails, what do you do?


It's a fail-in-place data center. If a hard disk fails, you don't do anything. They used to replace telephony equipment all the time too, now it's just a locked room in a building. This is pushing data centers to be more hands-off.


So they don’t do anything? Eventually you’re just maintaining stacks of errored and broken computers.


By then you've replaced the datacenter and you can shift the workload and then salvage it.

In theory you'd only leave it down there for three years anyway before everything in it is worth zero, at least to the IRS.


Yeah. Fail in place over the service lifetime. When enough stuff is failed that it isn't worth keeping down there, you pull it up and refill it.


Based on the picture, "pulling it up" looks like a fairly intensive task, requiring boats, persons, etc.

That kind of thing eats directly into the ROI for a datacenter. I doubt it competes with a static building with a bunch of solar panels on top.


Might just fail-in-place, then. Although that looks pretty bad from an environmentalist perspective.


Article says no maintenance: if it breaks, it's broken.


Overprovision with cold failover.


As an aside in the article:

> There has been growing concern that the rapid expansion of the data centre industry could mean an explosion in energy use. But Emma Fryer, who represents the sector at Tech UK, says that fear has been exaggerated.

> "What's happened is we've had the benefit of Moore's Law. The continued advances in processing power have made doom-laden predictions look foolish"

There may be other reasons that energy efficiency will continue to improve, but Moore's law (more specifically, in relation to performance per watt, Dennard scaling) has long been at an end. Given her position, it's fairly ignorant to cite this by now as a reason to expect data centre energy consumption growth to stay low.


> In a normal year, demand for electric power in Chelan County grows by perhaps 4 megawatts ­­— enough for around 2,250 homes — as new residents arrive and as businesses start or expand. But since January 2017, as Bitcoin enthusiasts bid up the price of the currency, eager miners have requested a staggering 210 megawatts for mines they want to build in Chelan County. That’s nearly as much as the county and its 73,000 residents were already using.

Source: http://www.thenewstribune.com/news/business/article212008409...
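A quick sanity check of those numbers, computed straight from the quoted figures:

```python
# Sanity-check the quoted Chelan County figures.
normal_growth_mw = 4       # typical annual demand growth (MW)
homes_per_growth = 2250    # homes that 4 MW serves, per the article
requested_mw = 210         # capacity requested by miners (MW)

kw_per_home = normal_growth_mw * 1000 / homes_per_growth
years_of_growth = requested_mw / normal_growth_mw
homes_equivalent = requested_mw * 1000 / kw_per_home

print(f"{kw_per_home:.2f} kW average per home")         # 1.78
print(f"{years_of_growth:.1f} years of normal growth")  # 52.5
print(f"~{homes_equivalent:,.0f} home-equivalents")     # ~118,125
```

So the miners' requests amount to roughly half a century of ordinary demand growth arriving at once, which squares with the article's "nearly as much as the county was already using."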


Interesting idea. My hunch tells me that they’re underestimating the corrosive nature of salt water. I also wonder about vibrations, which are much more pronounced underwater.

But I guess we won’t know until we try!


I'd be curious to know the specific benefits of the Orkney Isles to this project; when I hear the renewable research there being referenced (by family there, some of whom are directly involved) they always claim the major benefit is the close proximity of different underwater conditions, with some of the greatest range in tide.

Yet I assume this isn't particularly useful for a datacentre of a well understood shape that doesn't intend to use the conditions to create power through novel designs.


That's true specifically, but Orkney has generally become a hub for renewables research, so the snowball effect may be in play here.


Orkney also has an electricity surplus due to an inadequately upgraded connection to the mainland.

There's a proposed upgrade to link to the grid at the nearby Dounreay nuclear power station: https://www.ssepd.co.uk/OrkneyCaithness/


If clusters of these tanks are eventually deployed, it could have some surprising effects on local marine life and might be a good way to jumpstart a marine preserve. [0]

[0] https://www.abcactionnews.com/news/local-news/clusters-of-ma...


This opens quite an interesting perspective for 'off-shore' hosting, literally. I remember from a previous gig a while ago that renting servers in Saskatchewan or something similar provided companies with a favourable legislation as well as cheaper server cooling. If you plunge a data centre into the middle of the sea somewhere in international waters, does it really fall under any legislation?


No legislation cuts both ways. Be prepared to fend off pirates and state actors attempting to claim free loot.


I can see a few robotic hands doing magic under the sea while being operated remotely. Submarine drones deliver the hardware. I can see this helping MS to move into the space and a huge boost for robotics. It will get there eventually, I think the no maintenance is just to test if it is worth it before going all in


This can't be a new idea. Or is it that it's only now become possible to do such a thing?


I remember reading Google was doing the same experiment some 10 years ago.


Microsoft boiling the oceans is a positive move for environmentalism, hybrid owners agree.
