(source: https://www.bbc.com/news/technology-44368813 -- see video)
edit: just to be clear, my questioning isn't meant to be read in a denigratory way. just wondering. Also thanks to Kydlaw for pointing out that it actually said "up to five years".
Based off https://natick.research.microsoft.com/
Teddy, Vicky, Paulette, Sally and René?
Better playing safe than sorry, I suppose
While I do think we are seeing the limits of what a publicly-traded company can actually realize, there are of course counterexamples: Google always seems to have multiple irons in the fire, and there are many other examples in the comments on this discussion of corporate research labs: https://news.ycombinator.com/item?id=24200764
This is oft-parroted without any hard evidence though. If it were true even broadly, most companies would be doing absolutely no R&D because all R&D is just cost in the near term.
However, it’s not clear that Microsoft’s next version of Windows, for example, is actual research rather than the software equivalent of making a movie. As such, I think what people are talking about is fundamental research, not the kind of R&D which happens to qualify for a tax break but is mostly just the cost of doing business.
IMO, the line for what still qualifies as research is basically the DARPA self driving car challenge. Before the event it looked like basic research; afterward it looked like an engineering challenge to get there first. In 2004 nobody finished, though several got close; in 2005 five teams finished and the race for commercial success was on.
If not much has happened after 2 years we're good to go. We need to get into the future as fast as possible.
If the concern is that you'd see significant failure after 5 (which is likely) which could undercut your future plans, I could understand cutting the plan short.
It's quite possible that failures at the 5 year mark won't maintain a proportionate ratio with failures at the 2 year mark when compared to control. Maybe drive "A" tends to have an unacceptably high failure rate only after "X" hours of up-time.
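As an illustration (a toy Weibull hazard model with made-up shape and scale parameters, not real drive data), a wear-out failure mode can look fine at the 2 year mark and still be failing at several times that rate by year 5, while a constant-rate mode stays flat:

```python
def weibull_hazard(t, k, lam):
    """Instantaneous failure rate h(t) = (k/lam) * (t/lam)**(k-1)."""
    return (k / lam) * (t / lam) ** (k - 1)

# Hypothetical drives: constant-rate (k=1) vs wear-out (k=3), 10-year scale.
for k in (1.0, 3.0):
    h2, h5 = weibull_hazard(2, k, 10), weibull_hazard(5, k, 10)
    print(f"k={k}: h(2yr)={h2:.3f}/yr  h(5yr)={h5:.3f}/yr  ratio={h5 / h2:.2f}")
```

So a drive population that looks identical to control at 2 years can still diverge badly by year 5.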
That does sound plausible. But I do wonder how much might have been due to extra care. If I were the sysadmin on the project, I probably would have spent extra time on component selection, cable seating, burn-in testing, etc. Lots of pressure for it to do well.
Edit: Unrelated, but this picture is funny to me. I don't think there's enough room to slide that server out, so I'm not sure what he's doing. https://ichef.bbci.co.uk/news/800/cpsprodpb/48D6/production/...
"there were no humans on board"
Made me think of this:
"The factory of the future will have only two employees, a man and a dog. The man will be there to feed the dog. The dog will be there to keep the man from touching the equipment."
I assume the same dynamic is at play, but I find the entire thing to be riddled with hubris, and projective of industry for industry's sake.
The first thing I thought of when reading this article is whether the increased shielding from the water reduced the impact of cosmic rays on the hardware.
It's also a really good random number generator, for those Hackers/crypto people that would like a true source of randomness.
I think (but have no reference) that the amount of cosmic rays the planet blocks by being in the way is dwarfed by the effects of the magnetosphere and solar rays.
Back when I worked at a supercomputing center, we had "operators" on duty, who were supposed to visit the machine room every 2-3h or so and check several things.
It turned out that they were the major cause of hangs and reboots of our SunSITE server (a large FTP archive) — walking on the lifted datacenter floor caused vibrations which were enough to disturb the (terrible) external SCSI connectors to multiple drive arrays.
So, I can certainly believe that statement.
Maybe securing the vertical bars after the entire thing slides horizontally into the tube?
I'm assuming the tube has some rails they use to slide the racks in but then they have to be properly secured in place.
Needless to say whenever I was there I kept having flashbacks to the first Resident Evil movie.
(Of course, individual projects still have finite budgets.)
(To be clear, I'm speculating without any real knowledge of this subject, and welcome the inevitable corrections.)
I want to talk with aliens as much as the next guy, but so far it seems like the vast majority of the universe is a vacuum--biologically speaking.
Which is to say: you’re thinking on the wrong timescale. Just wait and see. :)
Apart from being drawn as particles on diagrams, they violate every rule of what a particle is supposed to be.
There are formalisms where they aren't necessary, for example quantum lattice models.
On earth, matter only rushes to fill vacuums because the surrounding air or water pressure pushes it in.
My limited understanding was that 'pressure' is just a simplified way of speaking about statistical mechanics.
Maybe not. Barnacles probably conduct heat at a similar rate to water. And they create a rougher surface with greater contact area to the surrounding water. And some of them actively filter water, push it around. Perhaps having a layer of barnacles would increase cooling.
Barnacles and other accretive life forms means that the thermal gradient drops sharply, and hence the heat flow rate drops significantly.
Take a comparison between (say) a 25C heatsink in contact with 10C water across a boundary layer that's likely on the order of 10mm thick, versus what is effectively a static fluid ~50mm thick followed by the same boundary layer.
So the unmixed fluid is going to be ~6x thicker, for a heat flow rate about 1/6th the original. And that's assuming that barnacles do in fact conduct as well as water, which is unlikely (water is ~0.6 W/m·K; human flesh ranges from ~0.2 to ~0.5).
Insert a bunch of unstated assumptions about the fluid flow rate, salinity, etc.
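That layered estimate is easy to sketch as steady 1-D conduction in series (all numbers are the rough assumptions above):

```python
def heat_flux(dT, layers):
    """Steady 1-D conduction: flux = dT / sum(L_i / k_i), in W/m^2.
    layers: list of (thickness_m, conductivity_W_per_mK) tuples."""
    return dT / sum(L / k for L, k in layers)

dT = 15.0  # 25C heatsink vs 10C water
clean = heat_flux(dT, [(0.010, 0.6)])                 # 10mm boundary layer only
fouled = heat_flux(dT, [(0.010, 0.6), (0.050, 0.6)])  # plus 50mm barnacle layer,
                                                      # optimistically at k of water
print(clean, fouled, clean / fouled)  # 900.0 150.0 6.0
```

i.e. the fouled surface moves a sixth of the heat, and less than that if barnacle shells conduct worse than water.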
I'm not sure how the MS Natick's cooling system works, but if it's anything like a ship, the heat transfer surface is NOT typically the skin of the hull (although keel coolers are essentially that, but I don't see keel coolers on the Natick hull). Instead, ship cooling systems typically suck in the virtually endless supply of cold seawater and run it through a heat exchanger (often a plate type), generally with a freshwater loop on the other side that runs to the hardware to be cooled. The servers would be cooled by this fresh chillwater loop.
The above system only works if a) you have far more cool seawater than heat input (i.e. you're not trying to cool a large server farm in a stationary pond or small lake), and b) you have a means to clean the biofouling that will occur on the seawater side of the heat exchanger if left unabated. The latter, as indicated by reduced cooling capacity, may be why MS needed to stop the project 3 years shy of the objective completion date.
As for the noise, that can have a negative impact, but it's going to be nothing compared to the noise ships are making: https://www.npr.org/2020/07/20/891854646/whales-get-a-break-...
Whether it's high-pitched or low-pitched noise also makes a huge difference underwater.
(and yes, it's a damn shame the oceans are so noisy due to ships and sonar...and you're prob right about heat not being harmful, I'm just curious what you're referring to)
Mostly the sun, but also warm freshwater rivers (one of the biggest temperature differentials you'll find excluding hydrothermal vents). Areas with a lot of organic material (algae etc.) close to the shore can also get very warm compared to the temperature average. Though I'm not sure about the cause/effect for the latter.
One of these might not be a big deal, but a thousand? Worth questioning.
This is a valid point for small streams, whose ecology can't adapt when the temperature changes mid-stream.
For oceans the picture is a bit different though.
Suppose we're looking at a cubic kilometer of seawater that has a temperature of 10C. The entire output of a nuclear power plant (single unit, about 1200MW) would heat that water by less than a single degree in a month.
The average power consumption (and thus roughly heat generated) of a single server rack is somewhere around 12 kW, so you can power about 100,000 of these for that. Microsoft's submarine had only 12 racks. From this one could conclude that even the localized effects are likely to be minimal.
Now for perspective: the entire ocean has a water volume of roughly 1.35 billion cubic kilometers. Next to major energy sources like the sun, and even smaller ones like hydrothermal vents and streams of warmer water entering it, your puny server submarine is not going to be noticed. With a nuclear power plant's worth of energy you're heating the whole body by just about 1/100,000,000th of a degree per year, assuming the heat wouldn't dissipate out of the water at some point.
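A back-of-envelope check of those figures (rounded constants, so treat these as order-of-magnitude only):

```python
C_WATER = 4186.0       # J/(kg K), fresh-water approximation
KM3_MASS = 1.0e12      # kg per cubic kilometer of water (~1000 kg/m^3)
PLANT_W = 1.2e9        # 1200 MW plant
RACK_W = 12e3          # ~12 kW per rack

month = 30 * 24 * 3600.0
dT_month = PLANT_W * month / (KM3_MASS * C_WATER)
print(f"1 km^3, one month: {dT_month:.2f} K")        # ~0.74 K, under a degree

year = 365 * 24 * 3600.0
ocean_mass = 1.35e9 * KM3_MASS
dT_ocean = PLANT_W * year / (ocean_mass * C_WATER)
print(f"whole ocean, one year: {dT_ocean:.1e} K")    # ~7e-9 K

racks = PLANT_W / RACK_W
print(f"racks powered by one plant: {racks:.0f}")    # 100000
```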
And really, if you're running your servers outside the water, the heat would dissipate into the ocean at some point too, making it a moot point in the grander scheme of things.
So the only area of interest concerning marine life is about 5 meters in every direction from your server tube. I'm willing to bet it'll have way less of an impact than a warm freshwater river discharging into the ocean.
I don't know anything about ecology, but my intuition is that all of these things from wind turbines to data centers under water, have an impact on their ecosystems. It wouldn't surprise me if we found out these had a negative (or neutral) impact on their environments.
I guess my point is that it seems naive to simply hand-wave off the possibility that these supposed environmentally friendly technologies actually negatively impact their environment. Whether or not that negative impact is less than the alternative is an interesting question.
We have a freakin' enormous open thermonuclear power plant beaming on us since the beginning of days. Just don't impact the planet's ingress/egress ratio with that CO2 and it will all turn just fine!
Heat - not a problem now.
Greenhouse gas - big problem now.
There are many islands and archipelagos in Scotland, e.g. the Hebrides, which are closer to the mainland but still out of the way of fishing.
Many of them are going to be set up with tidal and wind generators.
Would be interesting to see how it was tethered to the shore with networking and power.
That I can think off the top of my head:
- Orbital marine, making tidal turbines https://twitter.com/Orbitalmarine
- Their grid https://www.youtube.com/watch?v=FXe1hBvlylw
- Their Hydrogen facilities, using extra renewable https://www.youtube.com/watch?v=Rybpaqhg5Qg
I suspect there is simply more infrastructure for this kind of thing in Orkney than the other islands.
IIRC, it gives coral a place to grow.
And I meant to say that it’s good for fishing, not shipping. My bad.
But honestly who cares if your power comes from renewables in the first place -- solar and wind? It doesn't seem right to even frame it as "wasting" heat in the first place, anymore than the sun's heat was being "wasted" warming up the ocean in the first place.
I'm sure smaller scale data centre heat to district heating schemes must already be in place. Fundamentally you are using the same technology to cool the data centre (a heat pump), just pushing that heat into hot-water / steam, rather than dumping into the air.
Here's a source (in Danish). Wind energy in -> 25 MW of heat out (to heat up 12,000 houses):
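A quick plausibility check on those figures: 25 MW spread over 12,000 houses comes out to roughly 2 kW of heating per house, which is a believable district-heating load:

```python
# 25 MW of district heat shared across 12,000 houses
per_house_w = 25e6 / 12_000
print(round(per_house_w), "W per house")  # 2083 W per house
```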
I don't know what you'd call large-scale, but the school I went to was entirely heated by the datacenter across the road and the DC still had plenty of heat to spare. Afaik they heat all of Science Park Amsterdam with the Equinix DC just to the east: https://osm.org/go/0E6VOVkQc-. I think it was a requirement of the local government that this waste product be reused. Amsterdam has also had a datacenter stop due to power supply issues.
Edit: this seems to be the press release (only source I could find): https://www.equinix.com/newsroom/press-releases/pr/122801/eq... "Third data center in amsterdam" matches that this DC is called Equinix AM3, and it says "Excess heat from the data center will be used to warm nearby buildings and for other third-party uses."
Not to say that this happens with all the data centers across the world, but if using a decent chunk of a DC's waste heat isn't a large-scale heat recovery project, then I'm not sure what would be.
That's why you don't see it...
- Technicians were extra careful (slow) when installing the equipment.
- The datacenter pod used no recycled parts. Traditional datacenters are full of recycled marginal-quality parts. Maintenance teams balance the cost of buying new parts, the cost of testing used parts, and the labor & downtime costs from recycled parts failing.
Extreme reliability is already achievable but not economical. One reason why Google Search beats Bing is that Google's infrastructure software is more tolerant of flaky hardware, so Google can spend less money on hardware maintenance, reducing the cost per search.
Hopefully Microsoft will release a report and tell us the source of the underwater datacenter pod's low failure rate.
>The team is speculating that the greater reliability may be connected to the fact that there were no humans on board, and that nitrogen rather than oxygen was pumped into the capsule.
>"We think it has to do with this nitrogen atmosphere that reduces corrosion and is cool, and people not banging things around," Mr Cutler says.
Better cooling and isolation from all kinds of radiation seem like they'd also be beneficial.
One of the facts left unmentioned is that this was built and operated by Naval, the French state-owned submarine and shipbuilding corporation.
That disease is very prevalent, and I don't understand why.
My less-cynical guess is that the industry is stuck in the past, and journalists need better training and tools.
My more-cynical guess is that they are afraid of irrelevance, so they are defensively trying to keep you in their walled garden of information instead of encouraging you to get into the habit of getting info more directly.
Sort of like how we try link to the original YouTube video.
I think with actual articles written by presumed journalists, linking to source data is what establishes the credibility of the author’s writing and suggests they have read and understand the content.
Not linking to it doesn’t mean the author doesn’t understand it, but it may mean their work does nothing more than regurgitate (adds nothing of value apart from increased distribution)
When you consider the significance of undersea cables to the global economy, the propensity of states to intercept them covertly, the difficulty of attacking or even finding submersed compute, and so on, the ramifications are significantly greater than 'green compute'.
It's like the main function of the site is to trap you.
I first saw this on Facebook but I have since seen even sites that I used to respect follow this same pattern.
To me this is nothing but another dark pattern.
I was left wondering whether they were referring to the project, or just Azure Availability in general. /s
I doubt there's an "Orkney-Underwater" region.
"Natick was used to perform COVID-19 research for Folding at Home and World Community Grid."
Also the data center designation for it was "Northern Isles" (SSDC-002).
Also nice if they discussed the energy/cost involved with deploying and retrieving these capsules and how well that would amortize if this became a commercial solution.
Develop one full scale prototype subsea datacenter, which could be used as a modular building block to aggregate subsea datacenters of arbitrary size
Gain an understanding of the economics of undersea datacenter TCO (total cost of ownership is the full lifetime cost of a datacenter including manufacture, deployment, operations, and recovery) should we proceed to commercial deployment.
Seriously though, what is the direct ecological impact of doing this at scale, would the local increase in temperature have an immediate effect on the life around it? If so how much of an impact?
What about the effect on surface life and life in intermediary layers of the water? After all, a body this size radiating tens of kW of heat would cause substantial convection. At data-centre scale could it conceivably shut down ocean currents or re-route them?
Overall, an underwater data center should generate far less heat than an aboveground one.
It's also possible that a direct increase in ocean temperature has undesirable knock-on effects that don't take place if you operate on land.
I think on a smaller scale test with an artificial pool would provide some solid answers. We could even put solar panels across the surface to limit wind and solar evaporation effects
"I think on a smaller scale test with an artificial pool would provide some solid answers. We could even put solar panels across the surface to limit wind and solar evaporation effects "
If it's an artificial pool you'd need evaporation to cool it.
Please correct me if I'm wrong.
Thermodynamics should really be emphasized in schools. You are getting some amazing responses that are completely ignoring the fact that our renewable energy solutions are not increasing the overall heat in our planet. They are just moving energy around.
The only thing that really matters is what the emissions are going to look like when we are manufacturing renewable energy equipment (either new capacity or replacing faulty units).
They do. And so does every single thing we build that's exposed to the outside - buildings, cars, even you when you are out and about.
The effect is minuscule unless we are turning the planet into Coruscant or this is a gray goo scenario.
The effect of a single human on the planet is minuscule, but we're still in the bad state we are.
We're looking at covering the planet in solar panels; it's worth considering how the albedo changes will affect things on a planetary scale.
Oceans warming up is a big thing, and local effects can be substantial even if global average change is negligible.
More importantly, being "free in terms of CO2" is still an all-else-being-equal perspective. It's focusing on one aspect (CO2 emissions) of one small cog (a CO2 extraction+storage plant). If we look more broadly, each barrel extracted is offsetting less than one barrel being burned elsewhere (since nothing is 100% efficient). CO2 extraction and sequestration is thus a form of power transmission: the work that is required to offset emissions (e.g. from a car) is being performed away from where the emissions are made (although for flue capture this might be quite close!). For example, we can think of these as being roughly equivalent:
- A fossil fuel car with solar-powered carbon capture and storage onboard
- A fossil fuel car with solar-powered carbon capture and storage in some other location
- A solar-charged battery-electric car (+ a little CCS to offset manufacturing emissions, etc.)
These are all solar powered and carbon-neutral (as long as they offset enough). Let's say they each receive a similar amount of solar energy: the first will not get very far, since offsetting is very energy intensive and it needs more fuel to carry the solar+CCS equipment. The second is more efficient, since the fuel doesn't need to move the solar+CCS equipment; it's as if the offboard CCS is transmitting a little extra power to the car. The third will get much further, since the battery and electric motor make much better use of the solar power than the CCS system.
The first approach is clearly silly. The second is useful in situations where renewables can't be used directly (e.g. jumbo jet fuel), but is incredibly wasteful and expensive compared to the third. The third approach is best, and should be used as much as possible.
If somewhere has an abundance of renewable power (e.g. geothermal in Iceland), then "transmitting" it elsewhere via CCS is much less efficient than, say, laying a high-voltage DC line; or moving high-energy, location-agnostic activities to the region like aluminium smelting or datacenters.
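A toy numeric version of that ordering (every figure here is a made-up round number chosen only to illustrate the argument, not a measured efficiency):

```python
SOLAR_KWH = 100.0     # solar energy available to each setup
CCS_PER_FUEL = 1.5    # assumed: kWh of CCS work per kWh of fuel burned carbon-neutrally

def fuel_car_km(solar_kwh, mass_penalty=1.0, kwh_per_km=0.8):
    # The solar budget caps how much fuel can be burned carbon-neutrally.
    fuel_kwh = solar_kwh / CCS_PER_FUEL
    return fuel_kwh / (kwh_per_km * mass_penalty)

def ev_km(solar_kwh, kwh_per_km=0.2):
    return solar_kwh / kwh_per_km

onboard = fuel_car_km(SOLAR_KWH, mass_penalty=1.3)  # hauls its own CCS rig
offboard = fuel_car_km(SOLAR_KWH)                   # CCS sited elsewhere
ev = ev_km(SOLAR_KWH)                               # direct electrification
print(round(onboard), round(offboard), round(ev))   # 64 83 500
```

Whatever the actual numbers, the ordering onboard < offboard < electric holds as long as CCS costs more energy than it "transmits" and batteries beat fuel-plus-offsets in end-to-end efficiency.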
Doesn’t matter is strong. It won’t in the short term. But as we continue increasing our energy use as a species, the simple thermal problem of waste-heat management will certainly surface.
Of course, heating water locally, etc, can cause its own environmental impacts.
Hydro, wind and waves are probably at that ideal except to the extent that they are tidal energy.
Solar panels... are literally in the business of making the planetary albedo lower; to the extent that they do so, they are introducing thermal energy.
Geothermal is in the business of increasing the rate at which heat escapes from underneath the surface, which increases surface temperature.
Tidal energy is in the business of extracting energy from the kinetic energy of the moon, which probably increases the temperature of earth (but it's hard to say to what degree).
Fusion (if it ever becomes practical, and you count it as renewable) is in the business of releasing energy trapped in hydrogen nuclei, increasing the temperature. This is particularly problematic because fusion would also enable us to increase our energy usage to the point that direct heating becomes a problem at the same scale as CO2 release currently is.
Fission (if you count it) is like fusion.
Space based solar (if it ever becomes practical), is increasing the area of the sun captured instead of the albedo, and is directly introducing energy.
If you think that's silly (and rightfully so), then perhaps that can sharpen your intuition on how insignificant is the total amount of heat produced by our devices (even if cumulatively they are a big looking number); the total amount of radiation that comes from the sun down to earth is staggering.
IR might be reflected or absorbed.
The visible component is absorbed. Whether it would have turned into heat or been reflected back into space is roughly a one-in-two chance.
Growing 100 trees and chopping them into lumber is less hot than growing 100 trees and burning them.
If all the sun's energy were converted to heat (and not radiated away), we'd be in big trouble. That's what "carbon" pollution is all about -- carbon dioxide is a greenhouse gas that traps heat. Reducing albedo is one way to increase temperature, but directly burning stuff is another way.
The point was, coarsely: using e.g. solar panels only changes the Earth's surface temperature to the extent it changes albedo. (Ignoring second-order effects of concentrating heat and associated effects on radiation, etc.)
The 84,000 ppm for 60 minutes is roughly the lethal CO2 concentration. Local CO2 concentration is often several times atmospheric CO2 levels. That's clearly addressable, but I suspect that at around 8,000 ppm atmospheric we would start to see deaths from this, which is achievable from coal deposits. Reaching a fully lethal atmosphere is of course much harder.
So, I think you’re right temperature pollution at extreme levels is worse.
I don't know how this would compare between underwater datacenter, solar panels, wind.
Where do you get that idea? A typical EER 12 air conditioner will move 3.5x the heat energy that it consumes (COP = 3.5).
At the location of the HVAC it puts 130% of the source heat into the environment.
But 30% of that heat came from electricity generated in a power plant. Power plants are typically less than 50% efficient, so the plant puts out as much heat into the environment as the electricity it delivers, bumping the value to 160% (130% + 30%).
However, waste heat is a small fraction of the heating that electricity generation produces. Very roughly 10 times as much heat is trapped via the CO2 released as heat is released by the power plant, bumping that value up to 460% (160% + 30% * 10).
I.e. 4.6 units of heat are put into the environment for every unit of heat removed from a closed system.
(Obviously the details of this depend dramatically on the environment. Heat pump efficiency depends on the degree of temperature gradient, CO2 release and power plant efficiency depends dramatically on where the power is coming from, which changes with where you are located.)
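The chain of figures above, spelled out (inputs are the same rough numbers: COP 3.5, a ~50%-efficient plant, and CO2 trapping ~10x the plant's released heat):

```python
COP = 3.5
heat_removed = 1.0
electricity = heat_removed / COP          # ~0.29 units (the comment rounds to 30%)
at_hvac = heat_removed + electricity      # ~1.29 -> "130%"
plant_waste = electricity                 # 50%-efficient plant: waste heat == output
at_plant = at_hvac + plant_waste          # ~1.57 -> "160%"
co2_trapped = 10 * electricity            # heat trapped by the released CO2
total = at_plant + co2_trapped
print(round(total, 2))                    # 4.43 (4.6 if you use the rounded 30%)
```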
Also I think you're a bit pessimistic about modern power plant efficiency-- combined cycle plants do better than 50%, and that's before we're considering any benefit from renewables.
Fair point, I guess my argument makes more sense if we were discussing moving naturally occurring heat out (i.e. household ac) than with respect to cooling a datacenter.
Nitpicking the numbers used in the estimate... is probably not worth it. Every bit of it is a very rough order of magnitude number. If you're somewhere with 95% renewable energy it should be an order of magnitude better, if you're somewhere where energy production is dominated by an inefficient coal plant it should be an order of magnitude worse.
Wouldn't solar/wind have a smaller CO2 footprint, and hydroelectric be more efficient?
But without scientific sources, these are only wild speculations.
Consider, if you can achieve a fully passive cooling solution by dropping a datacenter into a lake, you've reduced the energy consumption in service of cooling by 100%.
(In reality, water cooling isn't "free," but I'm willing to bet the amount of energy required to dump heat into surrounding water is a whole lot less than the amount of energy spent for the compression cycles and forced air of above-ground HVAC systems. Water cooling using direct application of chilled water is already a thing, using lakes or retention ponds as places to dump heat; what being at the bottom of a lake gives you is a more consistent and proximate source of cool water than you might expect from a current chilled water distribution system)
Obviously this can be mitigated if you are able to get renewable power from a nearby source like a geothermal plant, hydroelectric, or solar. But if you are using fossil fuel power from a long distance away, that means any unit of heat moved by an HVAC involved many units of heat production to ultimately move that unit of heat.
P.S. am not HVAC wizard.
The name of the game here is energy efficiency and conservation. Use less power by reducing power distribution loss at scale. Want an even greener solution? Make 'em nuclear powered like a giant submarine -- that way the power generation isn't creating a heavy ecological impact.
I think your posing of these questions genuinely fails to appreciate this for what it is: a successful proof of concept that will permit a step toward a greener future.
> what is the direct ecological impact of doing this at scale
What is the impact on sea creatures if we put these giant heated cylinders in their territory? Is there any impact? I really doubt we have the answers yet.
> at scale
...also implies that there will be more than one. So if one heat cylinder doesn't do anything, what about 10? 100? 1000? 10,000? The cylinder pictured is really quite small, you'd need to make a ton of them or drop one mega-cylinder to compete with land based centers. And it's not like our current data centers are keeping up with computing demand; we are building more today and presumably will continue building them decades into the future.
So these are extremely important questions. We shouldn't dismiss this as a workable idea but we should also keep in mind that fucking up the ocean is a far more consequential action than fucking up some acres of land.
You're missing half the saying. Creatures adapt, or die. Given the recent articles about how species extermination is accelerating, it's a valid assumption that as many are dying as are adapting.
Not to mention, oceanic currents are propelled by heat. Changing that balance (whether by man-made resources or natural sources) could change the currents, which would impact coastal weather patterns.
This is adding the ocean as a new layer in the overall heat dissipation stack for datacenter computing; it's impacting a whole new ecosystem that it wasn't before. Thus, we need to be asking these kinds of questions about how this will impact oceanic ecosystems.
Then a study (ideally multiple studies) should be commissioned to find this out. We shouldn't just do it because we can, and because armchair physicists are pretty sure that it's insignificant.
Because, that's what Microsoft (and their partner) did. They did it because they could.
Remember when we, as a global society, believed that CFCs would have an insignificant impact on the environment? I do. Perhaps we should tread a bit more lightly when we're already running into species extinction and global climate change issues.
In which case, we'll just end up talking past each other, so I'll wish you a good day.
Asking the questions here on HN instead of in a place where the designers are likely to read them makes it seem it's more of a karma grab than a reasonable "I have concerns" type situation that he really wants to do something about. Nor are random HN commenters very likely to have big, fact-backed contributions btw. It's just alarmism under the pretense of innocently asking questions.
I have no assumptions either way, and if they do happen to frequent HN the chances are actually better to get such questions answered here than anywhere else.
Karma grab? FWIW I offered to hand back in all my karma points because they are utterly meaningless but Dang wouldn't have it so please spare me the nonsense accusations.
HN was still free to write to, contribute and ask questions of last I checked, I don't need you - nor anybody else - to tell me what I can or can not do here, nor do I need you to try to put me in a negative light for trying to understand something better.
FWIW humanity has an extremely well developed skill called problem solving. We can do just about anything in the laboratory. But when scaling up those laboratory experiments we often find out that what we thought was a net positive ends up being a net negative. Before we sink a few tens of thousands of data centers onto the continental shelf I'd like to know the ecological impact, even if that has already been studied (which I'm actually not aware of).
See also: plastic, freon, lead (in gasoline) and a whole raft of other things that seemed like a great idea at the time but for which we did not have the long term predictions when they mattered most: at the beginning, mostly because people did not ask the right questions.
Scientists in the beginning of the previous century: "Plastics, they last for ever! yay!" and a hundred years later "Plastics, they last for ever! Oops!".
Anyway, the steady stream of quality answers in this thread proves you more wrong than I ever will but this comment reflects poorly on HN, me, and ultimately, on you.
That said, the physics are "heat is heat". If you put it into the air or you put it directly into the ocean the only way it leaves the planet is by black body radiation. As a result locally heating some seawater nearby has (on a global scale) the same impact as heating the air the same amount.
Now we know there are some ecosystems in the ocean that prefer thermal vents and you might find that around the data center itself you have a wider variety of sea life than is found in the general vicinity due to different thermal conditions. Not sure if you could map out that was a positive or negative change.
Generally though, the ecological impact of doing this at scale is not going to be different in scale than land data centers.
I expect AWS us-east-1 now adds up to over a gigawatt of critical load so if submerged in the ocean off Virginia what would be the effects, and how would those be different from terrestrial datacenters?
Given the mechanics, the overall ratio of heat in the atmosphere vs heat in the oceans is fixed by the thermal resistance (R_θ) of the atmosphere/ocean boundary. If the atmosphere gets warmer, more heat is transferred. If the atmosphere is colder, less heat is transferred.
The other question (which Jacques alluded to) is what about the local conditions. And here too, the thermal mechanics give our underwater data center an advantage. Given the thermal conductivity of water, and seawater in particular, heat dumped into any spot rapidly diffuses to the rest of the ocean. That is not the case with air, which has a much lower thermal conductivity. Dumping lots of heat into unconstrained air locally can cause a localized "hot spot" which creates an interesting thermal plume and localized winds as cooler air around it comes rushing in.
To get a good understanding of just how effective the ocean is at diffusing heat, consider any of the hydrogen bomb tests in the Pacific. Prodigious amounts of heat dumped into the ocean creating a local hot spot (and a lot of steam!) and an undetectable change in overall ocean temperature. Kilauea volcano, same effect.
The ocean has a lot of thermal mass; without something like an asteroid from space, it's hard to move the needle on its overall temperature.
There are some pretty interesting thoughts around using harmonic resonators to convert ambient heat into IR radiation at a wavelength that can more easily pass through the atmosphere, but those are just lab experiments at the moment AFAIK.
Thank you, it is precisely that which I was wondering about.
But, if you're heating the oceans, you're putting an entire other mass (the atmosphere) between yourself and space. So you're heating the water, the water is heating the atmosphere, and the atmosphere is radiating that into space. It's adding a step to the overall cooling process.
This works through emitting mostly infrared frequency light for which, just like visible light, the atmosphere is mostly transparent.
This is a very rough calculation and there are obviously nuances, but the point is oceans are HUGE and water has a high specific heat. It's much easier to heat them indirectly with greenhouse gases.
Edit: I clearly need to brush up on my physics. Regardless, the effect is still minuscule.
The specific heat of water is 4182 J/(kg·°C) and its density is close to 1 kg/litre, so 4.182 kJ of energy is required to heat 1 litre of water by 1 °C.
The temperature rise of the oceans will be complicated to work out. Taking the OP's figures, 4.16x10^14 watts / 1.3x10^21 litres gives 3.2x10^-7 watts per litre of ocean, i.e. 0.32 microjoules of energy added to each litre of ocean every second. The temperature rise of the oceans will depend on how quickly they can dissipate this heat. What are the relevant heat-loss mechanisms? Evaporation just moves the problem to the atmosphere. Conduction just moves it elsewhere on earth. Radiation will shift some of it to space (and some will be reflected back to the earth), but radiation is a property of the surface of the ocean, not of the bulk.
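Plugging in the figures from this comment, the per-litre power and the (no-loss) yearly temperature rise work out as follows. A quick Python sanity check; the 4.16x10^14 W and 1.3x10^21 L values are taken from the comment above, and assuming zero heat loss is of course an upper bound:

```python
# Sanity check of the per-litre heating figure from the comment above.
power_w = 4.16e14        # total heating power, W (figure from the thread)
ocean_litres = 1.3e21    # approximate volume of the oceans, L
specific_heat = 4182.0   # J/(kg.degC); 1 litre of water is ~1 kg

w_per_litre = power_w / ocean_litres
print(f"{w_per_litre:.2e} W per litre")   # ~3.2e-07 W/L

# If none of that heat escaped at all, the bulk rise per year would be:
seconds_per_year = 365 * 24 * 3600
dt_per_year = w_per_litre * seconds_per_year / specific_heat
print(f"{dt_per_year:.2e} degC per year") # ~2.4e-03 degC/year
```

Even with the unrealistic no-loss assumption, the bulk warming is a few thousandths of a degree per year; the real question, as noted, is the local effect near the surface and the hardware.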
Amount of energy consumed from global data centers: 205 terawatt-hours. 
Amount of water in the oceans: 1,386,000,000 km^3 ≈ 1.386x10^21 kg
water has a specific heat capacity of: 4,200 J/kg°C 
Energy/HeatCapacity/AmountOfWater = 1.268×10^-7 degrees Celsius
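The arithmetic above can be checked directly. A minimal Python sketch using the same three figures from the list:

```python
# Reproduce the back-of-envelope number above: total datacenter energy
# spread across the entire ocean as a bulk temperature rise.
energy_j = 205e12 * 3600       # 205 TWh expressed in joules
ocean_mass_kg = 1.386e21       # mass of the oceans, kg
specific_heat = 4200.0         # J/(kg.degC)

delta_t = energy_j / (specific_heat * ocean_mass_kg)
print(f"{delta_t:.3e} degC")   # ~1.268e-07 degC
```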
You're confusing power and energy here.
Given OP's number for the volume of water in the ocean, heating the ocean by 0.01 K would take 5.4x10^22 J. Assuming 365 days in a year, that's 1.7x10^15 W, an order of magnitude less than OP's figure for datacenter power usage.
You mean the figure that OP first increased by 4 orders of magnitude, right?
1 calorie is by definition the energy required to heat 1 gram of water by 1 °C (in practice, 1 ml of water).
So ~1kcal to heat 1 liter of water 1°C
1 kcal is ~1.163 watt-hours.
That would mean going from 8.4 million data centers to 840 billion data centers. I mean, come on...
But hey, let's roll with it. What would be the effect on the environment if we literally had 10,000 times more data centers ABOVE ground, for comparison? After all, you are neglecting to take into account the fact that a HUGE amount of the wattage used by existing data centers has been cooling via HVAC, and by switching to underwater cooling their overall watt usage would theoretically drop a good bit due to the efficiency gains.
Also assuming we have coated the oceans with a perfect thermal insulator. Otherwise this heat would also radiate away, a small fraction going to the atmosphere and a lot of it back into space.
The watt is a measure of power, not energy. Did you mean watt-hour? The specific heat of water is about 4.2 kJ/kg/K. One joule is one watt for one second, that is 1 J = 1 W·s. So one W for one hour (3600 seconds) is 3.6 kJ and is enough to raise the temperature of a kg of water by about 0.9 K.
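The unit bookkeeping in that correction is easy to verify (trivial Python check of the figures in the comment above):

```python
# 1 watt-hour of energy delivered to 1 kg of water:
specific_heat = 4200.0     # J/(kg.K)
joules_per_wh = 3600.0     # 1 W sustained for 1 hour = 3600 J

delta_t = joules_per_wh / specific_heat   # K rise per Wh per kg of water
print(f"{delta_t:.2f} K")  # 0.86 K, i.e. "about 0.9 K"
```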
You need to brush up on dimensional analysis :-)
Unfortunately, the living parts of oceans are also the parts that tend to be easy to get to, and putting stuff farther away will be more expensive. But careful placement may be able to mitigate that.
It is still ultimately a 3D environment where the vast, vast bulk of the 3D environment is not thriving with heat-sensitive life. Our 2D surface intuition misleads us here. Our mental images of the ocean are of the exceptional locations, not the common ones.
Not to discredit your point or anything, it's a very good one. I think we just also have to answer it in conjunction with what we are currently doing. Maybe we can sink some data centers and have little to no impact while also realizing energy and cost savings?
I agree that switching to wind and solar would be better, but it's not obvious whether we can switch without transforming the world economy in ways people seem unwilling to do. Nuclear reactors can generate much more energy, and I really think it’s time to get behind them.
Radiation release is contained within the largest strata available on the Earth: ocean and rock. Anything that avoids our thin atmosphere is a win.
My personal opinion is that if we discover a more devastating weapon of some sort, such as kinetic bombardment using bunches of O'Neill space colonies as projectiles, nuclear becomes comparatively benign, and that kind of event could lead to more de-regulation.
We used to say that about plastics in the ocean. And landfills. And light pollution. And noise pollution. And space junk. And on and on and on.
Heck, not that many decades ago there were responsible people who thought that air pollution was no big deal because there's plenty of air, and it's just fine if California allows rich car collectors to keep buying leaded gasoline long after other states outlawed it. The impact is "minimal."
I used this calculator https://www.omnicalculator.com/physics/specific-heat
The worry is not that that the average temperature of the oceans will rise. The worry is that the local temperature of the ocean will rise. This is exactly what happens with other forms of thermal pollution. One example studied for decades was California's San Onofre nuclear power generating station. It has been subject to regulations regarding its thermal pollution into the ocean. They built long pipes so that the temperature increase from the cooling water could heat the ocean gradually over a wide area rather than severely in a narrow area. It's the same principle as a CPU heatsink.
For what it's worth, I personally found your comment quite distasteful. Instead of trying to understand what you didn't know, you knocked down a straw man and pretended like it was obvious and backed by science.
But that wasn't really the option now, was it?
If 64 bits isn't enough, the next logical step is 128 bits. That's enough to survive Moore's Law until I'm dead, and after that, it's not my problem. But it does raise the question: what are the theoretical limits to storage capacity?
Although we'd all like Moore's Law to continue forever, quantum mechanics imposes some fundamental limits on the computation rate and information capacity of any physical device. In particular, it has been shown that 1 kilogram of matter confined to 1 liter of space can perform at most 10^51 operations per second on at most 10^31 bits of information [see Seth Lloyd, "Ultimate physical limits to computation." Nature 406, 1047-1054 (2000)]. A fully-populated 128-bit storage pool would contain 2^128 blocks = 2^137 bytes = 2^140 bits; therefore the minimum mass required to hold the bits would be (2^140 bits) / (10^31 bits/kg) = 136 billion kg.
That's a lot of gear.
To operate at the 10^31 bits/kg limit, however, the entire mass of the computer must be in the form of pure energy. By E=mc^2, the rest energy of 136 billion kg is 1.2x10^28 J. The mass of the oceans is about 1.4x10^21 kg. It takes about 4,000 J to raise the temperature of 1 kg of water by 1 degree Celsius, and thus about 400,000 J to heat 1 kg of water from freezing to boiling. The latent heat of vaporization adds another 2 million J/kg. Thus the energy required to boil the oceans is about 2.4x10^6 J/kg * 1.4x10^21 kg = 3.4x10^27 J. Thus, fully populating a 128-bit storage pool would, literally, require more energy than boiling the oceans.
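The numbers in that quoted passage can be reproduced in a few lines. A sketch using the constants as given in the quote (c rounded to 3x10^8 m/s):

```python
# Reproduce the classic 128-bit ZFS "boil the oceans" estimate.
bits = 2 ** 140                     # 2^128 blocks = 2^137 bytes = 2^140 bits
bits_per_kg = 1e31                  # Lloyd's bound on bits per kg
mass_kg = bits / bits_per_kg
print(f"{mass_kg:.3g} kg")          # ~1.39e11 kg, i.e. "136 billion kg"

c = 3e8                             # speed of light, m/s (rounded)
rest_energy = mass_kg * c ** 2
print(f"{rest_energy:.3g} J")       # ~1.25e28 J (quoted as 1.2x10^28)

ocean_mass = 1.4e21                 # kg
j_per_kg_to_boil = 4000 * 100 + 2e6 # heat 0->100 degC plus vaporization
boil_energy = ocean_mass * j_per_kg_to_boil
print(f"{boil_energy:.2g} J")       # ~3.4e27 J, less than the rest energy
```

The rest energy of the storage medium alone comes out several times the energy needed to boil the oceans, which is the punchline of the quote.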
Supposedly, the recent (apparent) changes in UK climate are due to changes in trade winds driven by global climate change. Might there be small perturbations that we make that inadvertently cause large changes in major wind/current systems?
I'd guess someone has tried to model these things.
A really huge datacenter might draw 500MW, but that's only the same thermal output as about 7 airliners.
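That comparison can be sanity-checked with round numbers. The fuel-burn (~7 t/hour for a large twin at cruise) and jet-fuel energy density (~43 MJ/kg) figures here are my own ballpark assumptions, not from the comment:

```python
# Rough comparison: datacenter thermal output vs. airliners at cruise.
datacenter_w = 500e6                  # 500 MW, per the comment above

fuel_kg_per_s = 7000 / 3600           # assumed ~7 t of fuel per hour
jet_fuel_j_per_kg = 43e6              # assumed energy density of jet fuel
airliner_w = fuel_kg_per_s * jet_fuel_j_per_kg  # ~84 MW thermal

print(f"{datacenter_w / airliner_w:.1f} airliners")  # ~6
```

With these assumptions a 500 MW datacenter dissipates roughly as much heat as six or seven airliners, consistent with the comment's figure.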
You could submerge every datacenter on earth and no fish is going to notice.
Great question. Without real data it's hard to know for sure. I do think that it would be somewhat negligible though.
Considering we have heat vents and underwater volcanoes in the ocean that kick out insane amounts of heat, I can't see datacenters having a ton of impact. Will it affect the immediate vicinity? Probably. Will it affect the ocean at large? I doubt it. Unless we start sinking exaflops of CPU power into the ocean, I wouldn't worry too much.
I'm surprised msft wouldn't propose to pipe ocean water in, the way some power plants do, before going through the challenge of building under the ocean...
It's kind of literally a drop in the ocean, but I assume it'd be worth doing a bit of research about local disruptions if we intend to drop large deployments in shallow waters.
To shut down or significantly disturb ocean currents, we'd need a lot of these things.
So what if we make a breakthrough with fusion power? We will all die from heat exhaustion caused by the arse end of AC units bringing about the heat death of the Earth.