Microsoft has sunk a data centre in the sea to investigate energy efficiency (bbc.com)
512 points by Qworg on June 6, 2018 | 376 comments



> Additionally because there are no people, we can take all the oxygen and most of the water vapour out of the atmosphere which reduces corrosion, which is a significant problem in data centres

This is the first time I've heard that corrosion is a problem in datacentres. I've never noticed any corrosion on any machine at home or at work. Does it have to do with the cooling of datacentres?


I think part of it is that even problems that are small/unnoticeable at the individual level become big when multiplied by a huge N.

For example, I don't think I've ever had a hard drive fail in ~30 years of computing (well, without being dropped, that is), yet look at the Backblaze reports:

https://www.backblaze.com/blog/hard-drive-benchmark-stats-20...


That's just unbelievable. To even survive the Deskstar era blows my mind. Not a single drive?


That's not all that surprising to me, I've had supremely good fortune with drives. I have a Maxtor Fireball that still "works" (though its SMART has failed.) My current workstation still has drives in it from the three prior workstations; spanning back 7 or 8 years now[1], humming along well outside their warrantied operation. The first disk failure(s) I witnessed came while I worked at a computer repair shop, which would be "a huge N" compared to the number of PCs I've personally owned. My first disk failure on my own personal equipment didn't happen until I built out a home storage array - again a rather large multiplier of number of disks compared to a traditional PC. Even then that failure was because a disk controller got fried, the drive was mechanically sound and the data was completely recoverable.

Admittedly my own experience is probably a bit skewed since I don't have a personal laptop. By the time I got a job that provided me w/ a laptop solid state drives were already quite mature. In my experience doing repair work: laptops were a huge source of our failed drive & data recovery operations.

[1]: I should note these drives all operate nearly 24/7, no five-nines on my personal equipment though ;-P


Argh, trigger alert: I bought four of those in a RAID 10.

They even had the audacity to reject my warranty claim.


They sent me back refurbs that didn't work, I just gave up.


Up until a couple years ago I could have said the same. I even have a couple deathstars on the shelf that were operational when retired.

In the last couple years I've had several HDDs fail starting with a 2TB Seagate (one of the "stars" in the Backblaze stats.) I've had others fail since, mostly multi-TB drives. I had one nearly ten year old 200GB drive begin to report remapped sectors. I've even had an SSD develop an unrecoverable error.


My dad managed to kill a 2-year-old Samsung SSD; it wouldn't even show up in the BIOS. Tech support was so incredulous they actually had an electrical engineer on their staff call him.

I didn't believe him either. I was like "Dad, SSDs are designed to fail in a way you can recover data. What SATA cable are you using? Is the port dead?"

Turns out it was literally the drive.


I recall hearing of similar failures early on. I don't recall the brand. Apparently the controller goes bad and it just stops responding to SATA commands.

Intel had a strategy that caused their drives to go into read-only mode once they reached their rated lifetime write capacity. All data remained available until they were power cycled, whereupon they would intentionally brick themselves.

In my case it was a Crucial M4. It still tries to operate but will fail the long SMART self test.


Much like a poster above, I've got a walking graveyard of previous platter drives in my workstation from various old workstations and they're all fine, but (and I think mine was a Samsung too), my first experiment with an SSD ended in tears. It was fine for similarly about two years, then I had some weird troubles booting about 3 or 4 times, then it just disappeared completely.

I've been very wary of SSDs since then.


I have. Including things like the heads falling off and spinning around, grinding everything to junk.


That sickening click sound that a disk drive makes when it's on the verge of failure is ingrained in my head.

I tried putting one of those in the freezer once, and maybe it was a folk remedy, but it worked enough to boot and get my data off!


That must be one of the silliest hacks I've ever heard, right up there with oven-baking electronics to fix loose solder joints. I love it!


As a teen I managed to 'rescue' thousands of £/$ worth of graphics cards listed on eBay as spares/repair with this method. I recall this working about half the time; then again, at least 3 out of 10 cards from eBay listed as broken worked immediately when plugged in.


Did you have any "signals" you looked for on the listings to indicate they might just need an oven reflow, or did you just buy everything cheap and hope it worked?


Reminds me of the old Xbox 360 "red ring of death" fix where you could just wrap the box in towels and leave it running for an hour to overheat it, then reboot and it would work again.


Very recently I was able to get a bootlooping N5X to temporarily boot, long enough to recover pictures, by putting it into the freezer for a couple of hours. Some folks have (claimed to have) had long-term success from freezing the bootlooping devices.


Not OP, but I have had the freezer trick work more times than not. It's a last case type of deal, but when there's nothing to lose try it. Just make sure the drive is sealed up nicely.


Worked in HS for a local computer repair shop. One trick to improve success with the freezer method is to place the drive in a sealed plastic bag with some desiccant packs. Placing a bare drive in the freezer can kill it due to condensation. Leave the drive in the bag at least a few hours to give the desiccant time to work. Like others have said, this is a last-ditch effort; if you care about the data enough to send it to a specialist, don't do it, since it can ruin a recoverable drive. But it does seem to work ~30% of the time (at least w/ deathstars) and I did have success on a clicking Seagate drive last year.


It works because of how hard drives work - the heads are extended by slightly heating the bar the heads are connected to. If you're head crashing, you need a little less length, hence freezer.


This isn't quite true, the heads are kept from crashing by air pressure and a crash does irreversible damage. The "clicking" sound is either the heads flicking back and forth as the drive tries to seek, or the central motor trying and failing to start platters spinning which are jammed on the bearing.

What freezing the drive may achieve is either un-sticking it from the bearings (see also the "bang drive on edge" technique), or lowering the thermal noise floor in the electronics enough for marginal components to make it through the boot sequence.

It's not recommended. https://www.gillware.com/blog/data-recovery/hard-drive-freez...


When I worked for a small computer place years and years ago, we'd sometimes "tap" a drive as it was spinning up if freezing it didn't work. Tapping it with a screwdriver handle rarely worked, but it was always a last-ditch effort to save having to send a drive out -- which most people wouldn't pay for anyway.


I did that with a drive in my computer fairly recently, although I didn't so much tap it as shake it on startup. It did finally start spinning enough to ensure that I had the last few bits of data transferred to a new drive, though. :)


While they may make legitimate points in the article, it's hard to consider someone who makes money on HDD recovery, like the authors of the article, to have an unbiased viewpoint here.


Ah, interesting! Thanks for the correction. I was off the project before we put in the drives obviously. ;)


There was a type of Seagate drive long ago (I believe mine was a 2GB) that would stop working, apparently because the spindle got stuck. But if you shook it in a rotational manner, it would unstick! The first time I sent it for warranty service and of course they deleted all my data. But the next time it happened I learned about the trick.


I can confirm the freezer trick worked too. This was back before multi-platter high-density drives though. I tried it recently with a failed 6TB drive from a friend, and it didn't work.


Yeah, this was many years ago. I think the one I did it with was a 250GB drive from the mid-to-late 2000s.


It definitely works. I just did it a couple of years ago.

However, no matter how long I left it in the freezer, I couldn't get all the data off before it failed again. I ended up freezing it, then putting it in a small cooler with a couple of ice packs, with the wires hanging out between the cooler and the lid. That gave me just enough time to grab all the data (well, really, a couple of not-so-small VHDX files).

It was much fun, like a science experiment in high school.


You may be interested in this archive of failing-HDD sounds. Note: as it's about 10 years old, it requires Flash. http://datacent.com/hard_drive_sounds.php


Well, you're exactly part of a huge N: people who post on the internet, who combined have owned an even larger number of drives over the years.


Even managing just a few hundred nodes, I see hard drive failures pretty routinely. Servers hammering on storage devices 24x7 are much different from desktop computers. I've never had a hard drive fail on my own desktop machine, though I know people who have.


That backblaze report is absolutely fascinating! I'm curious though what the statistics would be from a more consumer point of view, with regards to quantity of use, operational temperature and so forth. I imagine most consumer drives sit idle almost all of the time, whereas theirs are more likely in constant use.

That said, that paints a pretty handy real world picture of what one might expect.


I killed a hard drive when I swapped motherboards about 15 years ago. I suspect static electricity.


My guess is that you change disks often. I've had multiple failures over the years, either from overheating or, more often, from use over extended periods of time (i.e. more than five years). Although I've never had a failure before the warranty expired.


Yesterday I lost a 3TB Toshiba TD01* series drive after 23.75 months of (home) use. Only 4,800 hours powered on. It started to fail on seeks, and SMART even showed it. It failed one week before the warranty ended. Ha!


I've seen a whole batch of hard drives fail: one-by-one the company's employees' PCs failed over a period of a week or two. They had all been bought together in a single upgrade.

I (on a Mac) was unaffected (and no doubt a little smug).


The only hard drive I've broken was because I let a pretty powerful magnet fall on the computer, right on the spot where the HDD is. Never play with magnets next to your computer.


My N is just 10, yet I've had two failures :(


A trend in energy-efficient data centers is to maintain temperatures and humidity at near the maximum acceptable levels to reduce energy use for cooling and dehumidification.

Data centers used to be like walk-in refrigerators. Now the air is borderline uncomfortably warm and slightly "heavy" with humidity.


Exactly. I know one that only does free cooling (they only have big fans; that's it).

The failure rate is a bit higher, but they are confident that their architecture is HA/redundant, and from what they told me, it's cheaper to replace hardware (they got some Dell "fresh air" servers) than to cool the DC.


Sounds like underground in the desert would be suitable?


The industrial scale cooling used in datacentres can produce weird results, like when it rained inside Facebook's first datacentre: https://www.theregister.co.uk/2013/06/08/facebook_cloud_vers...


Sounds like they didn't know about this building [1]. That happened 50-ish years ago.

1: https://roadtrippers.com/stories/boeing-factory


I wondered about this as well : https://www.researchgate.net/profile/Michael_Schappert/publi...

Seems like it's a combination of how data centres are cooled (flowing air) and the quality of the air being circulated. So if there are pollutants in the air, they pose a danger to the circuitry.

Never occurred to me this was yet another thing to be wary of when running a data centre!


I had a PC I built and moved to a bungalow near the beach. Within the year its power supply had rusted. My last PC, in a relatively arid environment, lasted 4 years.

It's probably the "sea" environment that would pose corrosion issues.


Power supplies are usually plated or coated recycled "chinesium" steel. This is because it's incredibly cheap and has a half decent EMC outcome for the price. One little nick in the coating or plating and that's it though. Really they could use other materials such as cast aluminium, brass or copper but they are orders of magnitude more expensive.

If you look at some of the older electronic test equipment which was made with much higher standard materials it's not uncommon to find something that has been in a damp shed for three decades and powers up just fine after the dead spiders have been removed and the mould cleaned off. BUT at the time of manufacture they cost more than a mid-range car.


In an interview on Radio Scotland, the project lead said the container is sealed with an inert gas replacing the air and with a significantly lower water vapour content than the atmosphere.


There is a stark difference depending on which sea you are located at. I used to live close to the Atlantic ocean, near the equator. The failure rate of electronic components was through the roof. Random memory failures (due to contaminants in memory contacts) as well as corrosion in copper traces in the PCBs. At least one instance of motherboard fire.

I'm now close to the Pacific. No issues whatsoever.


Yes. The air quality -- especially the amount of sulfur in the air -- contributes to sulfur creep corrosion. Run that through an image search to see the results. It's pretty dramatic. Some of this equipment looks like it had been underwater.

But while the older equipment was pretty stable (so long as the capacitors didn't bulge), today's equipment usually has to comply with RoHS. Lead, for all its faults, has a very well-understood corrosion mechanism. The exotic blends used to replace it, we're discovering, aren't always so great...

You know, now that I think about it, I wonder how many cell phone warranty claims have been denied for water ingress were actually due to poor solder choices? Color-changing stickers aside, of course.


Here's an interesting piece:

http://computer.expressbpd.com/news/ctrls-shows-how-data-cen...

tldr: gaseous contaminants corrode silver/copper components of circuitry


I think it's mainly a problem in highly polluted urban areas because of sulphur dioxide in the air. Most datacenters use enough air conditioning for moisture to be a non-issue.


Some AWS techs in Oregon from a few years ago have a fun story for you.


Maybe it is related to mitigation of sea water facilitated corrosion?


I think it has to do with the humidity of the air.


At what point does this become a problem? They're using the naturally cool sea water to regulate the temperature in the container, but that heat needs to go somewhere... it heats the water. If this becomes a popular alternative to classic server farms, at what point does it start to increase the temperature of the sea? I know, the oceans are big, but so is our data.


> I know, the oceans are big, but so is our data.

Meh. The biggest data center (ChinaTel's Inner Mongolia Information Park) consumes about 150MWe, assuming all of it becomes waste heat it's basically nothing: a nuclear reactor (each plant has 2~8) releases 2Wt for each We it produces, and they're usually sized between 800 and 1300MWe.

Hell, Earth averages 160 W/m^2 from the sun, oceans cover ~360 million km^2, so the oceans get on the order of 10^16 watts from the sun.

Not to say that it can't have a significant local effect: nuclear plants are strongly regulated to avoid heating their rivers too much (especially in warm summers), and there we're dealing with orders of magnitude more waste heat being dumped into a very finite (though moving) amount of water.
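
As a rough Python sketch of that last comparison (all inputs are the approximate figures quoted in this comment, not authoritative numbers):

  # Rough sketch using the figures above.
  solar_flux = 160                    # W/m^2, average absorbed from the sun
  ocean_area = 360e6 * 1e6            # 360 million km^2, in m^2
  solar_into_oceans = solar_flux * ocean_area    # ~5.8e16 W

  biggest_dc = 150e6                  # W, the ~150 MWe figure above
  print(solar_into_oceans / biggest_dc)          # ~4e8: solar input dwarfs it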


Even Greenpeace at least doesn't think that it's a bad idea:

"Experimental underwater data centres could be more sustainable if connected to offshore wind power, but Microsoft must focus more on investing in new renewable energy now. [...]"

https://www.bbc.com/news/technology-35472189


My understanding is that MS took that to heart on this second round. The article claimed they chose Orkney as the site of the new datacenter precisely because it has an excess of renewable energy.


That's kind of a clever idea to promote renewable energy.


Coupling an undersea datacenter with an offshore windfarm just sounds all-around clever.


Until a hurricane destroys it.


Build all above-surface structures out of inflatables. During heavy weather deflate and retract (5-10m). Use simple solid-state generators, e.g. https://en.wikipedia.org/wiki/Vaneless_ion_wind_generator


Are other structures safer against hurricanes?


An underground data center would work if hurricane safety is your top concern. In any case a location on land is probably cheaper to rebuild.


I think a possible hurricane would not destroy the data center itself, just the wind turbines.


What about earthquakes then?


>Meh. The biggest data center (ChinaTel's Inner Mongolia Information Park) consumes about 150MWe, assuming all of it becomes waste heat it's basically nothing: a nuclear reactor (each plant has 2~8) releases 2Wt for each We it produces, and they're usually sized between 800 and 1300MWe.

Seems a shame to let all that heat go to waste. Surely we could put that waste heat to use in a distillery or desalinization plant or something.


Desalination isn't a bad idea, but it does produce large quantities of waste brine, among other things: https://en.wikipedia.org/wiki/Desalination#Environmental


>but it does produce large quantities of waste brine, among other things

So combination data center, desalinization plant, and pickle factory then?


To let all that radioactive heat go to waste? Although maybe that's just the primary coolant circuit - is the water they use from rivers/the ocean made radioactive, or just heated?


I was thinking more of data centers. You could probably do it with nuclear too.

This gif: https://www.ucsusa.org/sites/default/files/images/2015/08/np... shows how typical nuclear reactors work. They're functionally just steam turbines; the nuclear reaction just heats a whole lot of water.

The waste heat, though, doesn't come in contact with the radioactive parts. That's what happens at the bottom, with the condenser.


You say "assuming all of it becomes waste heat", but doesn't all of the energy consumed necessarily become waste heat no matter what?


In addition, the heat represents extra energy. Warmth and water (and carbon) are the very fabric of life. Nature will adapt to utilize this free energy source before you could say "thank you".

My prediction is we'll be amazed at the life forms that develop and explode around such submerged cooling structures.


> Warmth and water (and carbon) are the very fabric of life. Nature will adapt to utilize this free energy source before you could say "thank you"

Ocean warming is extremely detrimental to ecosystems across the globe. You can't just simplify it to more heat == more energy == better

>In other non-tropical regions of the world, marine heat waves have been responsible for widespread loss of habitat-forming species, their distribution and the structure of their communities. This has a tremendous economic impact on seafood industries that have to deal with the decline of essential fishery species.

>It is likely that over the past century, the impacts on ecological chains have been more frequent as ocean temperatures have escalated decade after decade. This is the case in Nova Scotia, where kelp forests are literally being wiped-out by water which is much warmer than usual. In this corner of Canada, the ocean is not just a form of recreation, it also means a way of life for many that rely on fisheries and aquaculture as an important part of their economy.[1]

[1]https://www.theweathernetwork.com/news/articles/ocean-heat-w...


> In addition, the heat represents extra energy. Warmth and water (and carbon) are the very fabric of life.

That's cute, but 1. nature will adapt just fine to both heat and cold so that's not exactly compelling; 2. the issue is we may not, human civilisations have arisen in fairly specific conditions, and tend not to be very resilient to significant environmental changes

> My prediction is we'll be amazed at the life forms that develop and explode around such submerged cooling structures.

My prediction is we won't live long enough to see that happen, it takes kilo- to mega-years for anything more complex than bacterial mats to evolve to use new sources of anything.


> nature will adapt just fine to both heat and cold

Not with the same ease! That's the point — it is a one-way street. "Having easy access to energy" or "not having easy access to energy" are NOT equivalent states for flourishment. They're not equally "just fine".

The rest seems like you're grinding some anthropocentric axe unrelated to my post, so I'll abstain.


> "Having easy access to energy"

Life needs an energy gradient. In this case, direct access to colder water. No organism (or machine) can use the heat energy of its environment if it has no access to a colder medium.

Edit: I just saw that Retric explained it better (https://news.ycombinator.com/item?id=17245948).


Where is this confusion coming from? Are you suggesting there won't be colder water around the data centre? Or just nitpicking for the sake of it?

It looks like my original comment hit some HN ideological hot spot (unintentionally), but it's entirely uncontroversial scientifically.


> Where is this confusion coming from? Are you suggesting there won't be colder water around the data centre?

If a fish is swimming in 24°C water, it can't simultaneously be swimming in 19°C water. Maybe its friend 10 meters away is swimming in 19°C water, but that doesn't help the first fish.

Maybe that fish feels more comfortable in 24°C water, because it needs a certain body temperature to keep its internal processes running (i.e., to not freeze to death), but it cannot harvest energy from the 24°C water, which is what you claimed above. I'm not nitpicking, this is one of the most fundamental and important laws of physics.


I don't know whether this is feasible at the temperature gradients created by data centers, but it's not prohibited by physics. If there are chemical species in the water that are stable at 24 C and not 19 C, the fish can harvest these for chemical energy after they have traveled the ten meters.

This is probably more relevant at temperature gradients greater than 5C, but it's thermodynamically possible.


Nature, defined as the absence of human meddling, will be fine. Especially after humanity goes extinct.

In this context, we're not worried about the literal definition of natural but about keeping an environment in a state in which humanity can survive.


Heat is not really useful energy in the way you're talking about.

https://en.m.wikipedia.org/wiki/Second_law_of_thermodynamics

Life wants low-entropy energy like sunlight or glucose because it can do something with it. Heat can speed up chemical reactions, which makes a minimal amount useful indirectly, but not as an energy source.

PS: Heat gradients, unlike uniform heat, are a form of usable energy and, for example, create thermals, which are then useful.


You're correct of course, but same difference here. The extra energy will kickstart processes that nurture life.

I was actually pondering whether to expand on "energy", "water", "carbon" and "life" in the original post (none of them trivial concepts) but decided against it. It'd only muddy the waters, so to speak, missing the point:

A submerged data centre will be a net boon to the biological life around it.


> The extra energy will kickstart processes that nurture life.

Why would that be the case? If this were the case, the biological density of the highest-temperature locations in the ocean should be significantly higher than average. This has not been observed. Quite the opposite, actually.

Secondly, in thermodynamic terms, heat is the least useful form of energy. Heat is often the waste product of a chemical/mechanical process and cannot easily be converted into other forms of energy without significant loss.

And no offense, but water and carbon are both trivial well defined concepts. Energy is also fairly well defined.


Local pockets of increased heat have not been observed to nurture life? Thermals mentioned by OP are just one obvious example.

I agree making use of subtle energy gradients is not trivial, but life is pretty good at it nevertheless. Even in conditions you wouldn't expect it. And no surprise — it had billions of years to evolve that way.

If you wanted to be daring, you could even say that's what life is for.


> Local pockets of increased heat have not been observed to nurture life?

That's absolutely not what the comment you're replying to says. What it says is:

> If this were the case, the biological density of the highest-temperature locations in the ocean should be significantly higher than average.

> Thermals mentioned by OP are just one obvious example.

Thermals are not just heat, and by and large the heat is not a source of energy (sulphur chemistry is the basal energy source of thermals). And shallow waters have much higher biological densities.

Ambient heat is only useful so far as helping the organism improve the efficiency of its chemical and biological reactions, it's extremely rare for it to be an actual energy source (because as you've been told multiple times it's extremely hard to use/harvest). And organisms are generally adapted to a certain level of ambient heat with compensatory mechanisms matching, most don't do very well if you drastically change their ambient heat levels, again aside from micro-organisms with short lifecycle which can adapt extremely quickly.


Listen, my claim is simple: submerged data centres will give rise to richer, more complex ecosystems around them (due to the extra energy, heat convection, increased entropy etc), in a surprisingly short timeframe. Because that's what life's good at.

You disagree, giving reasons I find irrelevant here (a data centre won't make a dent in the average ocean temperature, and certainly won't make it "the highest-temperature location in the ocean"), but I respect that. The good news is that the impact will be easy to evaluate once deployed.

In fact, testing the data centre's impact on the surrounding ecosystem will surely be a mandatory component of any such project, so we'll get to see the hard data. Let evidence be the judge of the "absolutely nots".


Ask yourself why you're on this site.

Is it to steadfastly argue positions far outside your domain expertise or engage in discourse and learn?


Well, you're making it sound as if the data centers being cooled somehow wouldn't happen otherwise. Seeing as how the alternative is to cool using A/C, surely it's a case of causing less harm than the alternative?

Of course you could argue that the lower cost of this cooling method will create a larger demand for cheaper servers/data storage, which increases net harm, for which I have no answer.


> Of course you could argue that the lower cost of this cooling method will create a larger demand for cheaper servers/data storage, which increases net harm, for which I have no answer.

That argument doesn't apply here: demand isn't currently being constrained by us being afraid of the environmental impact of datacenters.


Sure it does. If the costs of these datacenter operations were extremely high (say, by a factor of 100), you bet we'd have lower bitrates for YouTube, fewer GB allowances on Office 365, Dropbox, Gmail, etc.

Conversely, if it became literally free to operate, you can imagine we'd have a higher demand for data, storage, etc.

It's like how center pivot irrigation reduced the water usage rate per acre but increased demand for water usage in various areas because costs were lowered as well.


edit: nevermind I can't read


You don't believe demand isn't being constrained in the way I described? So you believe that people are looking at cats on facebook, and stop because they're worried about the environmental impact of facebook's datacenters? You're joking...


FWIW, your local FB datacenter is likely powered by 100% renewable energy :)


There'll also be a lot less energy consumed in the cooling process - A/C is probably grossly inefficient compared to moving water around.


The overall heat increase is the same, but the localized effects might not be: fewer beings live in the atmosphere, and air displaces that heat faster than water, reducing the local concentration.


>The overall heat increase is the same

That's not the case if you're comparing with AC. You have to use energy (generating more waste heat) to pump heat up a temperature gradient.
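
As a rough illustration (assuming a typical air-conditioner COP of around 3, which is a guess, not a measured figure):

  # Moving heat uphill with A/C costs extra energy that also becomes heat.
  it_load = 1000.0                    # W of heat removed from the servers
  cop = 3.0                           # assumed coefficient of performance
  compressor_power = it_load / cop    # ~333 W of extra electrical input
  print(it_load + compressor_power)   # ~1333 W rejected to the environment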


Wait, is that true? Fewer organisms in air versus water? Is that due to the huge amount of e.g. zooplankton maybe? Curious...


>which increases net harm, for which I have no answer.

Find a desert under the sea and drop them all there. Some place that's basically water and dirt. It's probably going to be large enough that you could dump all of humanity's data centers there for the next 100 years and still have room to spare.


If my calculation is correct, the total annual electricity production in the world (I used 20 279 PWh) would heat up the oceans (1 347 000 000 km³ of water) by 0,0129°C.

It's rather easy to calculate because one calorie is the energy required to heat up one gramme of water by one degree Celsius, the rest is just unit conversions, but I could have messed up anywhere of course.

Even if I'm off by a few orders of magnitude I'm confident that datacenters won't be noticeably heating up the sea for a long time.
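
For what it's worth, here's the same arithmetic as a small Python sketch, using the inputs above (the 20 279 PWh figure gets questioned in the replies below):

  # Sketch of the calculation with the inputs given in this comment.
  energy_j = 20_279e15 * 3600                  # 20,279 PWh -> Wh -> J
  ocean_mass_kg = 1_347_000_000 * 1e9 * 1000   # km^3 -> m^3 -> kg of water
  specific_heat = 4186                         # J/(kg*K), ~1 calorie per gram per degree
  print(energy_j / (ocean_mass_kg * specific_heat))   # ~0.013 K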


> Even if I'm off by a few orders of magnitude

I'm not saying there is a problem with your calculations, but if you were just one order of magnitude off, then after 10 years that would be 1°C, which is a colossal amount. The main problem with global warming, as I understand it, is that a 1 or 2 degree change to the atmosphere would melt a huge amount of polar ice. I imagine that if the sea were to increase in temperature by that amount, given that water has better thermal conductivity than air and that's kinda where the polar ice mostly is, the effects would be at least as bad (?)

Edit: given that some of the heat would dissipate into the atmosphere and sea bed, maybe it would need to be more than one order of magnitude higher to have this effect.


> I'm not saying there is a problem with your calculations, but if you were just one order of magnitude off, then after 10 years that would be 1°C, which is a colossal amount.

That is assuming that all the electricity produced on Earth is used to heat the oceans, which is not realistic. Oceanic datacenters are not likely to ever amount to more than a few percent (and even that is unlikely) of the total human electric consumption.

For reference, apparently datacenters used 416 TWh in 2015, which is 0,002% of the total electricity usage.

That's why I'm confident that there is a lot of margin in my calculation.


> all the electricity produced on Earth

Oops, I misread your comment, sorry. I thought you were talking about all electricity used in data centres worldwide.


> then after 10 years that would be 1°C,

or 1000 years if off in the other direction


You can't assume that the ocean is perfectly mixed. For example, desalination plants dump highly saline water into the ocean that can have adverse effects on local ecosystems, even though their effect on the entire ocean may be negligible.


According to https://en.wikipedia.org/wiki/List_of_countries_by_electrici... , total electricity production is about 24816400 GWh/year (~25 PWh/year, so you were 3 orders of magnitude off).

  $ units -t '24816400 GWh / (1347000000 km^3) / (1 kg/dm^3) / (4200 J/kg/K)'
  1.5908368e-05 K


Units is a wonderful program; don't be afraid to use its definitions (/usr/share/units/definitions.units):

  $ units -t '24816400 GWh / (1347000000 km^3) / waterdensity / water_specificheat'
  1.5851925e-05 K
There are even definitions like `oceanarea`, etc ;-)


And if there are 7.62 gigapersons on Earth, that's about 3.2 MWh/year per person, or 271 kWh/month per person.

Since that's the same order of magnitude as, say, a US household, that does seem credible.
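
Roughly, in Python (same Wikipedia figure as above):

  # Per-person share of the ~24,816,400 GWh/year figure cited above.
  per_person_kwh_year = 24_816_400 * 1e6 / 7.62e9   # GWh -> kWh, per person
  print(per_person_kwh_year)         # ~3,260 kWh/year
  print(per_person_kwh_year / 12)    # ~270 kWh/month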


So that is to say: all the world's electricity production, directed toward heating the oceans, could increase global oceanic temperature by about 0.00001 degree C in a year.


Looking again, I think I got caught out by the "." being used as a decimal separator in some places.


One thing to note is that heat diffuses much slower through the ocean than the atmosphere - even if the impact on the average temperature of the ocean is negligible, local thermal pollution could still cause major issues. I don't know how much effect a pod of this size will have; hopefully Microsoft will fund and publish a comprehensive review.


I understand that this is just a back-of-the-napkin calculation, but I feel like the heating will have a more noticeable impact on the local environment (think 5km³), rather than considering all of our planet's oceans all at once.


> I understand that this is just a back-of-the-napkin calculation, but I feel like the heating will have a more noticeable impact on the local environment (think 5km³), rather than considering all of our planet's oceans all at once.

You're not putting all of the planet's datacenters within 5km^3 either. The calculation is fine.


You are indeed off by 1000 (assuming your inputs are correct)


I'm not surprised! Which way though?


Lower. You probably mixed kilograms and grams.


Now you can calculate by how much the water level would rise and how many square feet of land would be flooded...


However, will these values change depending upon the depth and hence greater pressure?


>(I used 20 279 PWh) would heat up the oceans (1 347 000 000 km³ of water)

Your calculation is stupid because you don't need to heat up the whole ocean; heating up small areas will have a butterfly effect, enough to badly affect bigger areas. Look at this:

>As the concentration of carbon dioxide in the atmosphere rises due to emissions of fossil fuels, more of the gas is dissolving in the ocean. That is lowering the pH of the water, or making it more acidic, which can make it harder for reef-building organisms to construct their hard skeletons.

A minor change in CO2 changed the pH of the water, which kills organisms over a wider area.

https://news.nationalgeographic.com/2016/03/160321-coral-ble...

There are also closed seas like the Baltic that need over 100 years to fully mix their water with ocean water; it's much less salty than other seas and warmer, and also much more polluted by toxins that were sunk there during and after WW2.


>Your calculation is stupid because

No need to be rude. "There is an issue with your calculation because" would have worked :/


Not a subsea engineer, but having worked in the space previously I can tell you that we already have giant man-made heaters sitting at the bottom of the ocean. Subsea oil and gas wells take advantage of the steady temperature at those depths to dump excess heat from the production fluid (sometimes up to 120F for straight crude, much higher for an oil and gas mix) before it rises up to the platform, sometimes using active or passive cooling on the ocean floor.

Then there are natural geothermal vents doing the same, probably an order of magnitude more, since long before humans were around.

The ocean is a pretty big thermal mass.


This will never become a problem.

Take the Mediterranean for example, a relatively small sea. It has a volume of 3.75e15 m3, with a mass of 3.75e18 kg.

We need about 4 kJ/kg/K to heat up water. To heat the whole Mediterranean by 1K, we need an energy of 1.5e19 kJ, or 4.16e6 TWh.

In 2008, total worldwide primary energy consumption was 1.32e5 TWh, 31x less energy than needed to heat the Mediterranean by 1K.
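
The same numbers as a quick Python sketch (inputs exactly as stated above):

  # Energy needed to warm the Mediterranean by 1 K, using the figures above.
  med_mass_kg = 3.75e15 * 1000               # 3.75e15 m^3 of water -> kg
  c_p = 4000                                 # J/(kg*K), the 4 kJ/kg/K above
  energy_twh = med_mass_kg * c_p / 3.6e15    # 1 TWh = 3.6e15 J
  print(energy_twh)                          # ~4.2e6 TWh
  print(energy_twh / 1.32e5)                 # ~31x the 2008 world primary energy use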


It cools back down when the water evaporates. But it doesn't do the oceans any good anyhow, because the heat also fuels algae blooms.

I doubt it will become popular because it involves waterproofing the enclosures, complicates maintenance staff access, and also carries the risk of water getting in and damaging the equipment.

There are other alternatives to AC. Using a cooling tower is a much better alternative to submerging the whole datacenter.


This is mostly aimed at large cloud providers such as Amazon, Google, and Microsoft. They can use units like this and take them up for scheduled maintenance every x years. What happens to some of the individual servers between maintenance is of less importance. The infrastructure can handle the failures and they can just turn off the affected servers if they cannot fulfill any role.


> Microsoft's Ben Cutler insists the warming effect will be minimal - "the water just metres downstream would get a few thousandths of a degree warmer at most" - and the overall environmental impact of the Orkney data centre will be positive.


I was wondering the same thing.

>> "Microsoft's Ben Cutler insists the warming effect will be minimal - "the water just metres downstream would get a few thousandths of a degree warmer at most" - and the overall environmental impact of the Orkney data centre will be positive."

It's not like we've heard that before about carbon dioxide emissions and other environmental pollution. Companies mostly have different interests than the protection of our environment, going by past experience.

I'm very interested to see what scientists and researchers think of this.


Wouldn't it be much smarter to recycle that heat energy? "Our goal is to recover and reuse all the heat produced." - https://www.telia.fi/yrityksille/english/telia-helsinki-data...


We're already heating the sea (and everything else) by running the air conditioners to cool the datacentres, and generating the electricity to power the air conditioners that cool the datacentres.

This is a lot more efficient.

Perhaps there might be a localised effect, but I doubt the effects would be as severe as the carbon emissions saved.


All the big power plants use rivers for cooling. It doesn't seem to be a terribly big problem unless there is a drought.


Thermal pollution can cause significant ecological impact [1] when left unchecked; however, thermal discharge is regulated under the Clean Water Act [2], hence the usage of cooling ponds [3] and towers [4].

[1] https://en.m.wikipedia.org/wiki/Thermal_pollution

[2] https://www3.epa.gov/region1/npdes/merrimackstation/pdfs/ar/...

[3] https://en.m.wikipedia.org/wiki/Cooling_pond

[4] https://en.m.wikipedia.org/wiki/Cooling_tower


Or summer is a bit hotter than anticipated, in which case fish are free to go ahead and die.


I'm guessing it is less than a drop in the ocean, given that the ocean covers about 3/4 of the Earth. Also don't forget that the Earth's core is hot as hell. I don't have exact numbers, but my guess is it would be much less than a drop in the ocean even if all datacenters were pushing heat into the ocean.


> At what point does this become a problem? They're using the naturally cool sea water to regulate the temperature in the container, but that heat needs to go somewhere... it heats the water.

Makes as much sense as the people who think that windfarms will eventually stop the wind, or people who worry that smoking is contributing to global warming.


How much of a problem compared to all the lava flowing into the ocean from the Hawaiian volcano?


Time for outerspace data centers with ultrafast space elevator strato-cable connections.


Counterintuitively, cooling is really difficult in outer space. It's cold, but there's no air or water circulating to remove the waste heat by convection. Instead you have to rely on radiating the heat away, which is much less efficient.
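
To get a feel for the scale, here's a rough sketch assuming an idealized blackbody radiator at 330 K facing empty space and ignoring sunlight; the 150 MW load is the big data centre figure mentioned upthread:

  # Back-of-the-envelope radiator sizing for space (idealized assumptions).
  sigma = 5.67e-8             # Stefan-Boltzmann constant, W/(m^2*K^4)
  t_radiator = 330.0          # K, an assumed warm radiator surface
  flux = sigma * t_radiator**4         # ~670 W/m^2 radiated away
  load = 150e6                # W, the large data centre figure upthread
  print(load / flux)          # ~220,000 m^2 of radiator area needed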


I have the impression "cold" is kind of ambiguous in space, because while there are very few particles, the ones that are there are moving quickly, which could be seen as a high temperature.

See: https://en.wikipedia.org/wiki/Thermosphere


Never thought of that. Nice point!


If the heat becomes enough, couldn’t we recycle it?


No, because you cannot convert heat alone into useful work. You need a heat difference (something hot and something cold). And the efficiency of the conversion depends on the temperature difference.


You could use it to heat something up that you actually want to be warm.


Right, but if you're putting it in a place that is naturally cold, you're going to have a temperature differential you can then work with to recycle the heat, won't you? Obviously you're going to be losing heat still, but you could reclaim some of that energy.


Yes, in theory you could do that, but the efficiency depends on the temperature difference. The higher a temperature difference you have, the better. The theoretical maximum efficiency of a heat engine is (Tmax-Tmin)/Tmax (temperature in Kelvin). So a few degrees is not going to result in anything usable. So to reclaim the energy, we need a large temperature difference.

But since the goal is to cool the datacenter, we want the temperature difference to be as small as possible.

Reclaiming energy from waste heat usually only pays off in industrial settings. If you have very hot steam, you can use it to power a steam turbine and generate electricity. But at lower temperatures, the efficiency is too low.
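
To put illustrative numbers on that (the temperatures below are assumptions, not measurements):

  # Carnot limit: (Tmax - Tmin) / Tmax, with temperatures in Kelvin.
  def carnot_efficiency(t_hot_c, t_cold_c):
      t_hot = t_hot_c + 273.15
      t_cold = t_cold_c + 273.15
      return (t_hot - t_cold) / t_hot

  print(carnot_efficiency(35, 10))    # ~0.08: warm server water vs. cold sea
  print(carnot_efficiency(500, 25))   # ~0.61: hot steam in a power plant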



> Prof Ian Bitterlin, a data centre consultant for nearly 30 years, is sceptical about the environmental impact of going underwater.

> "You just end up with a warmer sea and bigger fish," he says.

> And 90% of Europe's data centres are in big cities such as London and Madrid because that is where they are needed.

> Microsoft's Ben Cutler insists the warming effect will be minimal - "the water just metres downstream would get a few thousandths of a degree warmer at most" - and the overall environmental impact of the Orkney data centre will be positive.

Both of these opinions feel like exaggerations but I don't know enough about thermodynamics to know what the true answer is. I have a feeling that the Microsoft opinion is much closer to the truth, can anyone help me understand how I'd walk through the numbers?


Not an expert either, but this might be a good way to think about it: on land or at sea, the machines produce the same amount of heat. On land, in general, the environment around the machines will be much warmer than the desired temp. Cooling on land is generally an active affair, which involves generating and controlling the flow of a fluid and/or some kind of refrigeration cycle. At sea, being plunged in cold water is potentially sufficient to reach the desired temp, and cooling becomes a passive affair. All in all you are trying to displace the same amount of heat, but through a more efficient medium. Correct me if I'm wrong.


Oh, I'm sure that being in a cold environment and letting it passively cool you is more efficient!

What I'm more unclear on is how small the impact is. It seems very likely to me that throwing the waste heat into the bottom of the ocean has less of an environmental impact than running A/C to suck out the heat and dump it into the air.

However, is the impact really so small that you wouldn't notice it just meters away? That's the part I'm unclear on.


I think so, yes. Water has many times greater thermal conductivity than air, so the heat will spread out across a larger area faster, meaning a very small (negligible) temperature increase at any given location. You wouldn't be able to detect it from a few metres away.


I'm trying to see how the efficiency is gained and I just don't really see it. I guess you don't need to pump water for water cooling so you save the pumping energy, or maybe lower fan speeds or something, but it really seems like it would be a minimal amount. What it does seem like is a cheap way to build a water cooling system that doesn't really need to be maintained. It seems like switching to ARM would be a much bigger efficiency gain, but nobody cares because energy is cheap.


Switching to ARM might be a bigger efficiency gain, but nobody wants ARM. Azure already offers ARM servers. Moreover, these aren't exclusive - one can sink an ARM server just as easily as an Intel one.

But the efficiency gains are such: Air conditioning is actually inefficient, especially with air cooling. Water cooling is more efficient. Water cooling traditionally uses water as an exchange medium, but is eventually water-cooled-by-air anyway, just in bigger batches. Here, they can take in new cold water and throw out old hot water without actually bothering with any air exchange at all. Or at least, that's the plan. Maybe it'll work!

(If "nobody cares because energy is cheap", we wouldn't have tried this in the first place.)


Usually DCs use AC for cooling, which needs electric power, which needs faraway power plants and transmission lines. Here, you have a practically endless source of cool water that you can just pump directly.


> It seems like switching to arm would be a much bigger efficiency gain

That's a false dichotomy. More energy-efficient CPUs don't preclude efficiencies from cooling.

I suppose the case could be made that CPUs account for the vast majority of datacenter cooling needs and that ARM efficiencies would eliminate so much of that need that any efficiencies in the cooling itself would not be worth anywhere near this kind of cost. I'd expect some pretty extraordinary evidence backing up that argument, since those would be pretty extraordinary claims.


I don't understand why they don't locate it in a city in a cold climate and use the waste heat to heat the city's buildings.

There is precedent. In Seattle there's the old "steam plant" which piped steam to many local buildings to heat them in winter.


This may be more common than is often realized. Also in Seattle, we (Amazon) use heat from the Westin colo/datacenter to heat some of our office buildings ... https://www.greenmatters.com/news/2017/11/22/Z2tmzwQ/amazon-...

About ten years ago, before working at Amazon, my last job involved building a datacenter in Leiden, the Netherlands. There the city has a municipal heat exchange program and we could also vent the excess heat to be used for heating water.

Modern data centers, especially for Cloud services, are really really really big though ... so big that they have specialized real estate and power requirements. The locations where you can get that much power, and that much space, tend to be outer sub-urban or quite remote. In those locations, there are few consumers for excess heat so more effort goes into reclaiming the energy loss through other means.


You're talking about District Heating[1], and that's been around for decades in hundreds of cities. It's the source of the steaming manhole covers that are a fixture of noir imagery.

Denver's got the oldest continually operating system in the world, and within the last decade or so, they added a cooling loop, as well. Instead of a boiler and a cooling tower, you can subscribe to a steam loop and a chiller loop.

The problem is once again of gradient. It makes thermodynamic sense to pipe steam around, because of the large gradient between steam and ambient. But servers don't make steam. At best they make warm water only a couple of dozen degrees over ambient... and warm water doesn't have enough energy to heat buildings very effectively.

That said, and as someone already mentioned, people are doing it anyway. Seattle's internet exchange pipes its water across the street to the Amazon towers[2].

BTW, Seattle's Georgetown steam plant might not be making steam any more, but the one down by the market is still operating as Emwave Seattle[3].

[1] https://en.wikipedia.org/wiki/District_heating

[2] https://blog.aboutamazon.com/sustainability/the-super-effici...

[3] https://en.wikipedia.org/wiki/Seattle_Steam_Company


> At best they make warm water only a couple of dozen degrees over ambient

Consider that geothermal heating systems are based on the ground having a temperature of 55 degrees. The difference between that and cooler ambient is used to drastically reduce heating bills.

For example, if it is 30 degrees outside, that is heated to 55 by the earth, then the building heater only has to boost the 55 to 70 rather than 30 to 70.
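
Treating heating energy as roughly proportional to the temperature lift (a simplification), the numbers above work out like this:

  # The ground loop's share of the heating lift, using the figures above (degrees F).
  outdoor, ground, indoor = 30, 55, 70
  full_lift = indoor - outdoor        # 40 degrees without the ground loop
  remaining = indoor - ground         # 15 degrees the building heater must add
  print(1 - remaining / full_lift)    # ~0.62: about 62% of the lift avoided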


City heat requires building out quite a bit of infrastructure, then in summer you still have to get rid of it without pumping it through the buildings. Even Inverness doesn't really need heat in summer.


Hi Walter.

Also.

I'm assuming it has to do with distribution and reliability within regions, hence why you even need datacenters at different locations and not just one location.


I wonder why they are submerging these instead of just making them a floating barge.

I can't imagine submersion is significantly better for cooling, the energy density very likely already requires a circulating water system, and the added issues dealing with the pressure at depth seem risky.

Possibly there is a benefit not related to cooling. Perhaps weather/waves? Things are probably a lot calmer 30ft below the surface. The tossing and turning on the surface may place additional stresses on things like HDD spindles.


HDD spindles????


Spinning hard drives have a decent gyroscopic effect. Hold one while it's powered up and try rotating it in a manner that would change the plane of the platters and you will feel the resistance.

Each time you do that, it puts a slight load on the spindle bearings. Do it repeatedly on a varying 5-30 second cycle and you'll simulate what a harddrive on a boat or barge in the open water would experience.

I can imagine that would create additional wear and tear and contribute to an increased failure rate.


Right but who is deciding to use spinning hard drives instead of SSDs in an underwater data center?


Cooling fan spindles are subject to the same forces; and they had a large array of cooling fans.


I doubt they're going to use fans if they are pumping the air out of the thing.


Here's some more information from the blog post: https://news.microsoft.com/features/under-the-sea-microsoft-...

And here's the project site: http://natick.research.microsoft.com/


Wow. That is super interesting. Thanks for sharing.


I wish somebody worked on using that unwanted heat, like they do in swimming pool / ice rink combos, rather than finding ways to just disperse it faster in the environment.


It's tough in hot climates. There's plenty of heat to go around. And there are lots of people who live in hot climates who, for latency reasons, need servers near them.

Thus there are kind of two cases to solve for. You can stick a data center in Northern Europe, and that heat might be valuable enough that your approach could be to try to reclaim it. If you stick a data center in Singapore, you'd better focus on generating less heat in the first place or finding better ways to get rid of it.


Even in Singapore the max sea temperature is 32C, which seems cool enough to cool a data-center.


Or perhaps heating a greenhouse in a cold climate where fresh food is expensive or impossible to import.


Arthur C. Clarke wrote a short story about a power generator based on the temperature difference of a long pipe reaching into the cold ocean depths. I think the story was The Deep Range.


Toronto cools some of the larger office buildings downtown by pulling in cold water from the bottom of Lake Ontario.

http://www.acciona.ca/projects/construction/port-and-hydraul...


My local swimming pool is heated by the waste heat of a nearby power plant.


I wonder if the water would be hot enough for a thermal bath/spa?


Some non-environmental reasons for wanting to do this:

http://www.alexwg.org/publications/PhysRevE_82-056104.pdf

> Optimal intermediate trading node locations for all pairs of 52 major securities exchanges, calculated using Eq. 9 as midpoints weighted by turnover velocity... While some nodes are in regions with dense fiber-optic networks, many others are in the ocean or other sparsely connected regions.


I don't think this is done for the sake of efficiency, but rather latency.

For example, if you had a underwater data center that sat in the middle of the atlantic ocean in between new york and london, you could do some serious trading with that capability.


No, you could not. You would gain nothing over having a datacentre in London and a datacentre in New York and a low latency link between them.

You can note, for example, that there is no data centre half way between Chicago and New York, even though that area has cheap accessible real estate and billions of dollars have been spent on low latency communication links between the two.



For those confused by this like I am, the key part to look at is the "emulation" paragraph.


Even on the coast of large cities, like NYC or Boston. The real estate cost of having a data center in those regions is high, but it would be much cheaper to just hide your servers underwater, near where your users are.


This is a big reason. The cost of leasing space is definitely higher than the cost of cooling these datacenters.

Why spend several million dollars in leasing space when you can drop a datacenter capsule off the coast with free cooling? Who cares about the cost of hardware.


The cost of connecting the thing via submarine cable to a suitable land point is likely to be a lot higher than you think.


Do you have a source to back this statement?


So you’re going to put servers in a public waterway and you don’t think the US Government is going to have a word with you about it?

You’re going to pay even if it doesn’t have a street address.


However, Orkney isn't in the middle of anywhere, it's some distance even from Telehouse London.


Closer to Norway than London!

Of course, Orkney was part of Norway before becoming part of Scotland in 1468.


TIL. Those pesky Vikings.

My cousin, from Indiana, is currently serving as a minister on Stronsay. The scenic photos he's sharing are certainly encouraging me to contemplate a visit.


My great grandfather was skipper of the Evangeline who died along with all the crew off of Stronsay in 1909:

https://www.youtube.com/watch?time_continue=27&v=0UVWUbJp3Ys

http://www.axfordsabode.org.uk/disaster/evangeli.pdf

Edit: (Can't work out if was great great grandfather or not! - but certainly some ancestor!)

Edit2: I'd definitely recommend visiting Orkney - it regularly gets evaluated as one of the nicest bits of the UK to live, the scenery is fantastic and it has some of the most amazing historical artifacts in Europe.


Thank you, that document makes for compelling reading.


I wonder what the algae/flora impact on the heat exchangers will be after a year or two. This thing will be warm and stationary, so it will become overgrown really fast.


I imagine the first step would be to use standard marine anti-foul paint on the exchangers. But that only works for a few months at best.

After that, I'm not so sure. Perhaps a slow, continuous movement of an exchange surface past a hard surface would do the trick. When this stuff is young, it's easily wiped away, but once it's there for a while you get real problems.


We could look at hydrothermal vents to see the effect of heat source underwater.


The heat from hydrothermal vents on its own wouldn't support ecosystems in deep water; the vents also feed out various useful chemicals and nutrients. The base of these food chains is creatures that can build biological carbon compounds using chemical energy from those instead of light, which is called "chemosynthesis".

https://en.wikipedia.org/wiki/Chemosynthesis


Well everyone is talking about the energy usage, which is the point of the experiment.

But what happens after 5 years (the expected life of the datacenter)? Most of the computers in there will be worthless then. Will they bring up the datacenter and reload it with new equipment? Will the cost-benefit be in favor of re-equipping or of just sinking a new one in? If it's going to be cheaper to just drop a new one in, we will have an ocean floor littered with dead datacenters.

Haven't heard anything about this.


Even if the equipment is outdated and broken, the scrap might be worth something to someone.


I assume the shell and racks will still be worthwhile in 5 years


If you want maximum energy efficiency, build data centers in cold climates where the excess heat can be used. Yandex has a data center in the town of Mäntsälä, Finland, and it provides 50 percent of the town's heating needs.

An even more advanced concept is combined district heating and cooling, with data centers integrated into the network. Data center heat can be used in cold regions for heating. Combined heat pump/chiller units can produce heat and cooling at the same time: sea water (through absorption chillers) is used for cooling apartments, offices and data centers, while heat from data centers, purified waste water, etc. is used for heating apartments. During winter more heat is utilised; during summer more cooling is utilised. (A rough energy-balance sketch follows the links below.)

https://www.theguardian.com/environment/2010/jul/20/helsinki...

Suvilahti Data Centre New Build Case Study: Operationalizing District Heating and Cooling in large Data Centre in Former Electricity Station http://www.energy-cities.eu/IMG/pdf/WS2_Helsinki.pdf

https://www.helen.fi/en/company/energy/energy-production/pow...

COMBINED DISTRICT HEATING AND COOLING https://www.euroheat.org/wp-content/uploads/2016/04/Case-stu...

District Heating & Cooling in Helsinki http://www.iea.org/media/workshops/2013/chp/markoriipinen.pd...
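
To make the heat-pump idea concrete, here is a minimal energy-balance sketch in Python. The IT load and the COP are illustrative assumptions, not figures from any of the linked case studies.

  # Illustrative only: assume 1 MW of server heat is recovered into a district
  # heating network by a heat pump with an assumed COP of 3 (lifting ~35 C
  # return water to ~80 C supply water).
  server_heat_mw = 1.0
  heat_pump_cop = 3.0

  # Energy balance: Q_out = Q_in + W and COP = Q_out / W  =>  Q_out = Q_in / (1 - 1/COP)
  heat_delivered_mw = server_heat_mw / (1 - 1 / heat_pump_cop)
  compressor_power_mw = heat_delivered_mw / heat_pump_cop

  print(f"Heat delivered to the network: {heat_delivered_mw:.2f} MW")
  print(f"Extra electricity for the heat pump: {compressor_power_mw:.2f} MW")

Under these assumptions, every megawatt of server heat displaces about 1.5 MW of heat the network would otherwise have to generate, at the cost of roughly 0.5 MW of extra electricity for the heat pump.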


It's all very clever, but think about the cost and complexity. Installations, management, real estate, physical security etc.

I'm quite confident that even making optimal use of the excess heat you would end up with less money than with their solution. I don't think companies want maximum energy efficiency. They want maximum cost efficiency.


Exactly, cooling is expensive but not as much as human resources. With a sinkable data center, you could put it next to any coastal or riverside city.


The technology is already proven and saving money.

> physical security etc.

Physical security is not a problem. It's not like the city utilities manage the cooling of the data center; there are heat exchangers in between. If there is a problem with the city utilities, the heated sea water goes back to the sea.


So flooding a server starts to mean something completely different.


What would be its ecological impact? What if there are more of these data centers?


Is there a benefit over just putting it in a boat/coastal building and using water (perhaps pumped from some depth) for cooling?


Boats in that part of the world are subject to severe stresses because of the weather. And boats are always moving, even in harbour. Just maintaining a reliable cable connection to a boat is more challenging than connecting to a subsea container. Placing the container on the sea floor avoids almost all of the weather related problems.


Microwave links? How about an oil derrick style thing mounted directly to the seabed so it doesn't move?

There's also plenty of coastline away from major cities. Find enough open space and it becomes feasibly multi-tenant. Add wind/solar and run a cable (or 3).


The closest thing I remember is Google taking over an old paper mill in Finland that used a condenser sunk into a canal connected to the Baltic sea.

https://www.wired.com/2012/01/google-finland/ there should be a video with the details here.


There's also a data center in Helsinki, Finland that is cooled by sea water and the waste heat produced is piped via a heat pump into the district heating network. Here's a press release from 2011: http://www.investineu.com/content/atos-builds-world%E2%80%99...


This, it feels like an awful waste to dump the heat straight into the ocean. District heating networks are a very efficient way to heat up a city, and they can be used as a place to dump excess heat. Further, we've now also got district cooling, http://basrec.net/wp-content/uploads/2014/05/District%20Cool... [PDF warning]


Putting in on/in a boat or near the coast might get you the seawater for cooling, but the project lists a few other big benefits like oxygen-free atmosphere, constant temperatures year-round, and ease of deployment.


A hermetically sealed capsule can be placed on a boat just as easily as a submarine, if not easier due to the pressure difference.

I'll give you the constant temperature but this project is about using the nearly-free ocean to maintain temperature so surely the delta isn't too large here.

I don't see how this is easier than a boat at anchor.


Imagine a future where, whenever the energy price drops in a region, companies suddenly show up with a bunch of such tubes and drop them in the ocean. They do some batch processing, and once the energy price rises they pull them out and move on.

In reality of course connectivity would be a major problem and energy price differences are probably not large enough to make it viable.


Although it doesn't have anything to do with computational load, hydroelectric storage exploits the change in electricity price to store energy in hydroelectric systems. At low demand they pump water into the reservoir, and at peak demand they release it again through turbines, generate electricity, and sell it at the higher price.

https://en.wikipedia.org/wiki/Pumped-storage_hydroelectricit...
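
As a toy illustration of the arbitrage, here is a sketch in Python; the prices and the round-trip efficiency are assumed, not real market figures.

  # Buy cheap off-peak energy to pump water uphill, sell it back at peak price.
  pump_energy_mwh = 100.0   # energy bought off-peak (assumed)
  round_trip_eff = 0.75     # typical pumped-storage round-trip efficiency is ~70-80%
  off_peak_price = 20.0     # $/MWh (assumed)
  peak_price = 60.0         # $/MWh (assumed)

  energy_sold_mwh = pump_energy_mwh * round_trip_eff
  profit = energy_sold_mwh * peak_price - pump_energy_mwh * off_peak_price
  print(f"Sold {energy_sold_mwh:.0f} MWh at peak for a profit of ${profit:,.0f}")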


Powering down some facilities during temporary spikes would make sense (someday if hardware is cheap enough, operating only for peak solar generation/minimum local demand could make sense). Being able to relocate due to cost/legal/etc. over a few month period could make sense, or the special tax and planning/regulatory treatment a "ship" would get vs. a building on land.


Putting your datacenter in international waters, without laws or regulations, hidden from the world. Such a great way to do things without people knowing, or having people do things about it.

Power it with offshore windmills, and you really only need a network connection.

Great idea though, using water to cool your datacenter and leaving all the oxygen out.


Maritime law: there is no such thing as international waters for "stuff"; each vessel has to fly under a flag.

The MSFT Datacenter is a vessel under law; if they don't want to fly it under a flag, they'll have to essentially abandon it, and then salvage laws come into play.


In that context, the Internet's fascination with pirates makes a whole new level of sense. Servers classified as marine vessels, admins as literal captains sailing under flags.

In a more fun world, one of those vessels would fly the Jolly Roger and host a certain infamous Swedish bay.


If your data center isn't registered and flying a flag, won't it be salvage and belong to whoever recovers it?


Yeah but what are the odds of somebody finding it?


Just follow the cables!


Why would the physical location of the server matter with respect to people knowing what you are doing? I have no idea where Google's servers are when I fire off a search for cute cats in boxes, yet I still know what it's doing.

As for "without laws", Google still follows European laws even if my search happens to be handled on an American or British machine. You obey the laws where you operate, not where your server is located.


For companies who are already complying with laws in whatever jurisdiction I'd have to agree with what you are saying.

But if the economics make sense in the future on some level, I could see criminal enterprises interested in exploring things like this. If narcos are building/using submarines now, I could see them managing/tracking/monitoring global logistics from their sunken data centers (they'll need to find other ways to power them and transmit/receive, because I doubt they could just hook up their cables directly onshore anywhere, setting aside being an easy choke point to cut access).

Maybe future narcos, illegal EU personal-data miners/brokers, pirates, or whoever else might be interested, especially if the fixed cost of operating on land exceeds that of operating sunken data centers, and the probability/cost of seizure on land exceeds that of future sunken datacenters?


Let's hope this takes off and leads to regulating the oceans.

For example: industrial, unregulated fishing destroys coral reefs every day. They are thousands of years old and won't come back.


> It will not be possible to repair the computers if they fail

This is the part that concerns me the most. Pretty much all DCs have multiple 24/7 staff to deal with hardware failures and equipment swaps... telling a client "you can't access your hardware for 5 years" wouldn't go over too well.


Well, you clearly wouldn't lease dedicated hardware with this model. You just redistribute load amongst data pods when there is a failure, and pull the whole pod from service for refurbishment once it passes your total failure threshold.


Think about it in a cloud model where you aren’t renting a server you know by name. Imagine running something like object storage or FaaS where the cloud provider handles everything behind the scenes and can failover at any time without you seeing more than perhaps a few failed requests.

In that model, hardware failing is just a factor in total overhead cost. If the hardware doesn’t fail immediately it might be cheaper to leave a dead node in the rack than to pay a human to touch it, especially if they’ve already recouped a significant percentage of the purchase cost by the time it fails. Over the life of a server the cost of cooling is enough that a substantial savings will push that breakeven point earlier.
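
A rough way to picture that breakeven, with every number a made-up assumption rather than anything from the article:

  # Compare the depreciated value left in a failed server against the cost of
  # getting a human (or a recovery operation) to it. All numbers are illustrative.
  server_cost = 8000.0          # purchase price, $
  service_life_years = 5.0
  failure_age_years = 4.0       # the node dies with one year of life left
  touch_cost = 2500.0           # assumed cost of physically servicing it

  remaining_value = server_cost * (service_life_years - failure_age_years) / service_life_years
  print(f"Value recovered by fixing it: ${remaining_value:.0f}")
  print(f"Cost of touching it:          ${touch_cost:.0f}")
  print("Fix it" if remaining_value > touch_cost else "Leave it dead in the rack")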


> This is a tiny data centre compared with the giant sheds that now store so much of the world's information, just 12 racks of servers but with enough room to store five million movies.

How much is that in Library of Congress units? This trend of not giving out data but appearing to do so is strange.


"The Phase 2 datacenter can house 27.6 petabytes of data."

Assuming you're using 15TB as one LoC unit, then about 1,800 LoCs.
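
For anyone checking the arithmetic, a one-liner under the same assumptions (27.6 PB of capacity, 15 TB per LoC):

  capacity_tb = 27.6 * 1000        # 27.6 PB expressed in TB
  print(capacity_tb / 15)          # -> 1840.0, i.e. roughly 1,800 LoCs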


Interesting idea but the maintenance must be painful. Let's say a hard disk fails, what do you do?


It's a fail-in-place data center. If a hard disk fails, you don't do anything. They used to replace telephony equipment all the time too, now it's just a locked room in a building. This is pushing data centers to be more hands-off.


So they don’t do anything? Eventually you’re just maintaining stacks of errored and broken computers.


By then you've replaced the datacenter and you can shift the workload and then salvage it.

In theory you'd only leave it down there for three years anyway before everything in it is worth zero, at least to the IRS.


Yeah. Fail in place over the service lifetime. When enough stuff is failed that it isn't worth keeping down there, you pull it up and refill it.


Based on the picture, "pulling it up" looks like a fairly intensive task, requiring boats, persons, etc.

That kind of thing eats directly into the ROI for a datacenter. I doubt it competes with a static building with a bunch of solar panels on top.


Might just fail-in-place, then. Although that looks pretty bad from an environmentalist perspective.


Article says no maintenance - if it breaks its broken.


Overprovision with cold failover.


As an aside in the article:

> There has been growing concern that the rapid expansion of the data centre industry could mean an explosion in energy use. But Emma Fryer, who represents the sector at Tech UK, says that fear has been exaggerated.

> "What's happened is we've had the benefit of Moore's Law. The continued advances in processing power have made doom-laden predictions look foolish"

There may be other reasons that energy efficiency will continue to improve, but Moore's law (more specifically, in relation to performance per watt, Dennard scaling) has long been at an end. Given her position, it's fairly ignorant to cite this as a reason to expect energy consumption to keep growing more slowly than capacity by now.


> In a normal year, demand for electric power in Chelan County grows by perhaps 4 megawatts ­­— enough for around 2,250 homes — as new residents arrive and as businesses start or expand. But since January 2017, as Bitcoin enthusiasts bid up the price of the currency, eager miners have requested a staggering 210 megawatts for mines they want to build in Chelan County. That’s nearly as much as the county and its 73,000 residents were already using.

Source: http://www.thenewstribune.com/news/business/article212008409...


Interesting idea. My hunch tells me that they’re underestimating the corrosive nature of salt water. I also wonder about vibrations, which are much more pronounced underwater.

But I guess we won’t know until we try!


I'd be curious to know the specific benefits of the Orkney Isles to this project; when I hear the renewable research there being referenced (by family there, some of whom are directly involved) they always claim the major benefit is the close proximity of different underwater conditions, with some of the greatest range in tide.

Yet I assume this isn't particularly useful for a datacentre of a well understood shape that doesn't intend to use the conditions to create power through novel designs.


That's true specifically, but Orkney has generally become a hub for renewables research, so the snowball effect may be in play here.


Orkney also has an electricity surplus due to an inadequately upgraded connection to the mainland.

There's a proposed upgrade to link to the grid at the nearby Dounreay nuclear power station: https://www.ssepd.co.uk/OrkneyCaithness/


If clusters of these tanks are eventually deployed, it could have some surprising effects on local marine life and might be a good way to jumpstart a marine preserve. [0]

[0] https://www.abcactionnews.com/news/local-news/clusters-of-ma...


This opens quite an interesting perspective for 'off-shore' hosting, literally. I remember from a previous gig a while ago that renting servers in Saskatchewan or somewhere similar provided companies with favourable legislation as well as cheaper server cooling. If you plunge a data centre into the middle of the sea somewhere in international waters, does it really fall under any legislation?


No legislation cuts both ways. Be prepared to fend off pirates and state actors attempting to claim free loot.


I can see a few robotic hands doing magic under the sea while being operated remotely. Submarine drones deliver the hardware. I can see this helping MS to move into the space and a huge boost for robotics. It will get there eventually, I think the no maintenance is just to test if it is worth it before going all in


This can't be a new idea. Or is it that it's only now become possible to do such a thing?


I remember reading google was doing the same experiment some 10 years ago.


Microsoft boiling the oceans is a positive move for environmentalism, hybrid owners agree.


What happens in a large body of water (ocean etc) when an electric shock accidentally slips out or a live electric cable is cut? Is it similar to an underwater grenade going off, affecting a large area, or is it not a worry?


Electricity finds the path of least resistance.

If that power cable were completely cut, the electricity would arc from one pole of the cable to the other until some breaker shuts the thing off.

Even at great energy, the impact would be minimal.


Wondering how long it will be until we have cloud data centers in space... Seems like it could be useful to offer a flexible computing infrastructure to small sats and science missions without requiring ground comms.


Actually getting rid of excess energy is a huge problem in space since it's pretty empty: https://en.m.wikipedia.org/wiki/External_Active_Thermal_Cont...


Totally. It's the difficulties of putting lots of computing resources in space (heat, power, serviceability, radiation) that actually makes it attractive for selling compute time. Only one company needs to solve those problems and others can leverage that work for a fee. If it was easy, there wouldn't be a business opportunity.


I think we should spend more "energy" on reducing the heat in the datacenter and solve the problem for everyone. Cooling the datacenter in the sea seems like a hacky (though creative) solution.


Laser-based cooling systems are where we should put our focus.


At first I read it as "Microsoft has sunk GitHub in the sea...".

Sigh, my brain is trained to see MS+GitHub whenever I read a news article about MS or GH, as a result of all the hype over the past two days.


What's the average lifetime of a data center? I wonder how the enclosure for this underwater one compares.


On a totally non-serious note:

Just reading "Microsoft has sunk a data centre" sounds pretty hilarious given recent events (GDPR, acquiring GitHub)


> There has been growing concern that the rapid expansion of the data centre industry could mean an explosion in energy use. But Emma Fryer, who represents the sector at Tech UK, says that fear has been exaggerated.

If we're concerned about the technology industry vastly increasing energy consumption, let's look at the entire medium-sized country's worth of energy Bitcoin is consuming for absolutely useless calculations.


Task for some more suited than me: turn transactions into protein representations, and every mining operation becomes a folding operation.



Here's the problem for me. These mining operations can be exploited to an extreme. Any scientist or professor with knowledge in protein representations will have an unfair advantage over normal miners. This gap will be even larger than the hashpower gaps we see between ASICs and "normal" miners in bitcoin, litecoin, etc because scientists will have access to not only hardware but also extremely specific domain knowledge.


Isn't that good though? The scientists who make new discoveries get paid to do so, rather than just some random person with a gpu in their closet?


It's not, because no one would use a coin with such imbalance in mining power. It would be all around more efficient for people to just donate to scientists.


>Any scientist or professor with knowledge in protein representations will have an unfair advantage over normal miners.

I, for one, welcome our new protein folding overlords. May they rule us more justly than the investment bankers they've replaced. :-p


The problem with solving "real" problems instead of synthetic ones is that they aren't uniformly difficult, so clever people will game the system.


As long as gaming the system means solving a real problem, that would be a good thing right?


Not if you want the currency to replace traditional currencies in the future.


Presumably that wouldn't be a goal for such things. You could just as easily call them "coolness points" and use them as badges of distinction. That alone could motivate participation. God knows it works for Reddit.


https://stats.foldingathome.org/teams

Folding@Home already has team stats with points.


This has come up a number of times...probably because of the studies[1] over the last year or so. It's convenient to hit bitcoin on this front because it's conceivable to estimate its energy footprint. What I find frustrating is that we look at no other segment of tech this way. We never discuss the energy consumption of video games or the stock market. How can we say if this is very bad if we have nothing to compare it to? It's quite likely there are far worse offenders.

To be clear, I hold no bitcoin. I'm no fan of it, but I find criticism of its energy consumption disingenuous...especially when there are so many more valid criticisms of it to seize on.

[1] https://www.cell.com/joule/fulltext/S2542-4351(18)30177-6 and others


The assumption is that there is no real benefit to the energy expenditure Bitcoin demands. This isn't true for video games (entertainment is generally considered to be worth a decent amount of resources for the psychological benefits it brings) or the stock market (it's basically the heart of our economy).

If you don't share these assumptions, you'll come to a different conclusion. But I don't think it's hypocrisy.


Exactly, discussing this requires the assumption that bitcoin has little value, instead of directly discussing its value. It may not be hypocrisy, but it's certainly fallacious. It only reinforces existing beliefs in the merit/demerit of bitcoin.


I cannot understand why people (certainly not just you) are so relentlessly obtuse about the destructive effect of Bitcoin mining and how it differs from other currencies.

Say I trade a barrel of oil for $65. That barrel of oil is going to get burnt, (because otherwise it wouldn't have value to the recipient) but the process of burning it can produce something useful, like transportation.

If I burn a barrel of oil to create electricity to mine 0.0084 bitcoin (about $65), I've "traded" the same amount of oil for the same currency value, but I've used the energy up by hashing. Nothing is left over to transport people or make plastic or whatever.

There is a clear difference between a currency where it costs maybe 10 cents to create a token that circulates at $100, and one where it costs $100 which nobody gains to circulate at $100. It doesn't balance out - it is an unambiguous terrible waste. And as I think of it, isn't it really just the same waste as a 100% gold standard?


It continues to bother me that simply by having a problem with this particular argument against bitcoin, people assume I'm pro-bitcoin mining. Isn't it enough to have a problem with fallacious arguments?

Oh I think bitcoin is plenty destructive, but it's a simple fact that we never critique ANY other technology or field in terms of energy consumption this way. We don't break down the industry-at-large in terms of energy consumption, and evaluate whether that's a valuable ROI. We don't do that because doing so is reductive.

Power isn't fungible.

You know what I think is a waste? All the brain power and human resource time spent on solving problems with bitcoin when we could be putting that time toward going to Mars.


"it's a simple fact that we never critique ANY other technology or field in terms of energy consumption this way"

We do, though. I just did. I pointed out that a $100 bill costs about 10 cents to make, whereas mining $100 worth of BTC requires $100 in electricity be used.


In the same vein, I'd be curious how much energy gets consumed by HFT.


I would think far less. HFT isn’t intentionally computationally expensive.


And the HFT industry is incentivized to use as little processor time as possible, and thus as little power as possible.


That's not true.

They'll take a 400W 10GHz CPU over a 5mW 1 GHz CPU every time, if it generates better financial returns.


Parent was referring to latency optimization, I believe.

But I'd assume most of those folks have ML / other slower models constantly churning away in the background, even if their HFT reactions are highly tuned and simpler.


My impression is that to the extent that HFT is about relative speed, competition can't reduce power usage.


And how much by machinery to run advertising?


Not to mention all the extra CPU load the tracking and JS spam causes on everyone's phones and computers.


There's a large overlap between server farms and energy use backed by, or used for, marketing.


you're not going to hear the media complaining about that, though


Not very much. A handful of machines per exchange colo, for each of a few hundred orgs? Probably about as much power as a small-midsize college, if I were to guess?


Or focus on cryptocurrencies that are much more scalable and decentralized than Bitcoin and don't use proof of work as a consensus mechanism, such as:

1. Skycoin (Web of Trust), 2. IOTA (Tangle), 3. Aeternity, and soon also Ethereum with a PoS/PoW hybrid.


Or maybe start to focus on how cryptocurrencies just aren't working, are a waste of precious energy, and don't currently have real value outside of those that hold it and trade it.


Surely those humans wasting precious energy would not waste energy and resources if they were not sitting at home staring, hypnotized, at some Netflix movie?


Naturally scarce stores of value like Bitcoin, gold, diamonds, etc. have always "wasted" resources, whether they be power, human resource, or polluting the environment. On one hand you could regard them as immoral because of this, on the other hand they enable the freedom to store value without trust.


At this point, we know that Bitcoin doesn't scale to handle the volume of transactions needed for a global digital currency. Ultimately, if you own Bitcoin in hopes of the value going up, it will only go up if someone is willing to pay more for it. There's no other utilitarian use of it.

This means that Bitcoin is a 0-sum game*, and a tremendous waste of resources. At least with gold mining we end up with a material that has important industrial use.

* 0-sum game: meaning that in order for someone to profit with Bitcoin, someone else has to take a loss.


I'm not a huge fan of bitcoin and have never owned more than 200 dollars worth, but that's a super weird argument to make.

Of course something only goes up in value if someone is willing to pay more for it; that's true of art, cars, houses, or whatever. You could argue that a house has more utilitarian value, as you could always live there, but that completely ignores the other person's goals. A house isn't going to be of much value if jobs are scarce in that area, but bitcoin would still have value in that scenario.

All that energy is going to make sure you can send a large amount of value between two parties from almost anywhere in the globe with a minimum of transaction costs and a decent amount of anonymity.

This also ignores risk pricing. Say I live in Backwatermenistan and the political situation and the currency are looking unstable. I could buy a bitcoin from someone else who has one with local currency; I lower my risk of ruin from currency inflation by paying a price that I deem reasonable, and the other guy gets cash which he could use to start a business or anything else that's inside his risk tolerance. No one lost in that scenario.


>I could buy a bitcoin from someone else who has one with local currency, I lower my risk of ruin from currency inflation by paying a price that I deem reasonable

You could just use dollars, euros, or yuan instead. Which is what people currently do in that situation.

Or, be like the mobsters and take payment in goods like olive oil, stolen art, or drugs.


Issuing a $100 bill only requires about 10 cents of overhead. Mining $100 in bitcoin requires $100 of overhead because it's inherent to the hashing mechanism. This is a big difference.


> we know Bitcoin doesn’t scale to handle the volume of transactions needed

No longer true. I used to believe this as well, but I’ve been blown away by the work from Lightning Network [1]. Subsecond transactions that happen off the Bitcoin blockchain, scales beautifully.

[1] http://lightning.network


Does that actually exist? It was vaporware for a pretty long time (though admittedly, I haven't checked in more than a year).


It does exist and work. It's early days but you can now use it. Source: I bought a sticker using it. :-)


Yes, but it's still in beta on the mainnet so it's not ready for daily use yet. You can see all the nodes here: https://lnmainnet.gaben.win/


>Bitcoin, gold, diamond

You're being dishonest if you're pretending these things are identically comparable.


They're all arbitrary stores of value whose intrinsic value is not justified by the market price?


Only Bitcoin requires the continuous consumption of energy to continue being a useful asset though.


The only reason anyone cares is because it's relatively easy to compute this inefficiency compared to the vast sea of inefficient products that would require significant effort to even approximate.


Precisely, I'm sure there's an enormous amount of infrastructure, extraction equipment, storage, processing and reprocessing.

Without the full data such a statement is vacuous.


Bitcoin doesn't intrinsically require that much power. The computation to verify transactions is a tiny fraction of that required to compete for the next block by hashing. The high hash rate is because of the high difficulty, which is itself dependent on the hash rate of the entire network. If the price of bitcoin drops, the incentive to mine decreases, and thus so does the difficulty.

This is comparable to gold, as an increase in the gold price also incentivises increased mining.
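
A minimal sketch of that feedback loop, modelled on Bitcoin's actual retargeting rule (every 2016 blocks, scaled by how long they took, clamped to a factor of four); the starting difficulty below is just a placeholder.

  TARGET_MINUTES_PER_BLOCK = 10
  RETARGET_INTERVAL_BLOCKS = 2016

  def retarget(old_difficulty, actual_minutes):
      # Difficulty tracks the hash rate: if blocks came in slower than the
      # 10-minute target (miners left), difficulty drops, and vice versa.
      expected_minutes = TARGET_MINUTES_PER_BLOCK * RETARGET_INTERVAL_BLOCKS
      ratio = expected_minutes / actual_minutes
      ratio = max(0.25, min(4.0, ratio))   # Bitcoin clamps each step to 4x
      return old_difficulty * ratio

  # Half the hash power leaves, so the last 2016 blocks took twice as long:
  print(retarget(1.0e13, actual_minutes=2 * 10 * 2016))   # -> 5.0e12, half the difficulty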


But once extracted, their energy costs are negligible (especially with gold being stored and the rights traded instead of the physical asset a lot of the time). In the case of BTC you can't really separate the cost of mining a block from the cost of validating the transaction, because until it's in a block a transaction doesn't really exist (this will get much better with lightning though). But without the constant burn of mining, BTC (even post-lightning) will become useless.


Until it IS in a block. Then, like the storage cost of gold or any "real" material, the energy expenditure for maintenance is negligible. In fact, considering the storage requirements of something like bullion I'd argue the maintenance costs are considerably lower.


Yes any given transaction is valid once it is in a block but without continuous mining it's still worthless because it cannot be spent without a new block being mined. And since mining has to be pretty continuous or the whole thing becomes unusable because of loooong processing times the maintenance of the whole network kind of becomes maintenance of the transactions.

Even lightning would only last for a while without new blocks being mined because adding funds to your 'lightning channel' requires a transaction to be placed on the blockchain to open a channel.


You're assuming gold / physical value stores can be spent without shipping cost and exchange fees, and that the volume traded reaches 0 in your comparison. Is the volume of any value store ever 0?

Also, mining doesn't have to be continuous; its reward varies with volume to adjust the rate mined, so you only wait a looong time if transaction volume suddenly spikes.


> You're assuming gold / physical value stores can be spent without shipping cost and exchange fees, and that the volume traded reaches 0 in your comparison. Is the volume of any value store ever 0?

That's why gold isn't a currency; there are definite transaction costs, but the whole cost of transporting all traded gold isn't required for my gold to be spent. With BTC the whole network supports all transactions, and all transactions require the mining network for validation.

> mining doesn't have to be continuous; its reward varies with volume to adjust the rate mined, so you only wait a looong time if transaction volume suddenly spikes.

Mining has to continue as a whole, though, is what I mean. Because BTC has no physical value and no ability to transfer without active mining, its utility and value are inherently tied to the act of mining. There's also a hard minimum of active mining below which the network no longer remains secure against 51%/double-spending attacks.
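
On that last point, the usual way to picture the "hard minimum" is the gambler's-ruin argument from the Bitcoin whitepaper: an attacker controlling a fraction q of the hash power who is z blocks behind eventually catches up with probability (q/p)^z, and always succeeds once q reaches 0.5. A quick sketch (this is the simplified catch-up probability, ignoring the Poisson term in the full analysis):

  def catch_up_probability(q, z):
      # q: attacker's share of total hash power, z: blocks the attacker is behind
      p = 1.0 - q
      if q >= p:
          return 1.0
      return (q / p) ** z

  for q in (0.10, 0.30, 0.45):
      print(q, [round(catch_up_probability(q, z), 4) for z in (1, 3, 6)])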


> That's why gold isn't a currency; there are definite transaction costs, but the whole cost of transporting all traded gold isn't required for my gold to be spent.

Whole costs for transporting gold are definitely required to be spent... the gold must be transported from one owner's location to another, unless the escrow between the two parties is the same or you're actually trading ETFs.

> With BTC the whole network supports all transactions and all transactions require the mining network for validation.

This is true for the network as a whole, but not any given transaction. Such a model would mean that the network is topologically fully connected, which would in turn mean that it'd be orders of magnitude slower than it already is.


> That's why gold isn't a currency; there are definite transaction costs, but the whole cost of transporting all traded gold isn't required for my gold to be spent.

What I'm trying to say here is that my transaction doesn't depend on the energy required to transport all the gold traded.


Physical gold still needs to be mined, smelted, transported, and guarded. When you buy it and move it around energy is spent too. So if you really want an apples-to-apples comparison you'd have to count the energy needed to mine each oz. of gold.


The most succinct critique of Bitcoin is, in my opinion, that it's rediscovering the drawbacks of a 100% gold standard vs paper money. It's just like a pure abstraction of the waste of digging stuff out of the ground just so we can count things. It's inexpressibly sophomoric.


The large majority of gold traded for bullion or investment isn't actually moved; it's just the rights to claim X tr. oz. of gold in the vault of XYZ company. Also, the value of my gold doesn't depend on the transport of all the gold out there, so rolling those costs in at the individual level doesn't make much sense. BTC, on the other hand, is useless without mining, because the existence and transfer of BTC depend on the miners. Even post-lightning, mining is still critical to opening new channels.


The large majority of Bitcoin isn't actually moved every block either. And those systems you describe to reassign rights to gold also use energy. As block rewards fall, profit will come from transaction fees only, and only the most efficient miners (best equipment or energy access) will remain. Sure, Bitcoin uses more energy now than a database sitting on a server, but it's solving a different problem.


It is solving a different problem but that doesn't change the fact that the whole network's mining work is required to make it work. There are some energy costs to tracking the rights of stored gold but they're vanishingly small compared to BTC.

We're getting sucked down into the weeds here and losing sight of the original point, which is that BTC as designed requires a massive amount of energy to validate and secure the transaction ledger, and that ledger and the things it enables are the whole value of BTC.


I think gold is flawed in essentially the same way as BTC. Mining BTC is called mining because it's a neat abstraction of the old fashioned process. The reason fiat currency took over historically is because there's a tremendous benefit to eliminating the dead weight of mining if you can evolve a more subtle way of limiting supply.


Right, diamonds just cost human lives.


"Right, diamonds just cost human lives."

If we're going to play that game, where did the electricity come from for your watt-hour-sucking GPU?

A coal plant? How many died mining it? How many got lifelong illness or injury/disability mining it? How much poverty did the abusive mining industry inflict on poor, dependent communities? How many hundreds of thousands of people were affected by its emissions? How many thousands of cases of asthma? How many deaths?

If we go nuclear: Where was it mined? Who mined it? Who processed it? Where is the waste going? Are you supporting rather despotic countries depending on ore source? Etc.

Or the cards themselves. Where did the rare earth materials that go into graphics cards come from? Where were the graphics card parts manufactured, and where was the board assembled? Are you benefiting from near-slave labor for your cards?

I mean, if we want to play that game, there's so much to examine!


I like nuclear, it's the obvious choice, and incredibly safe.

https://en.wikipedia.org/wiki/Uranium_mining_in_the_United_S...

"Uranium mining in the United States produced 3,303,977 pounds (1,498,659 kg) of U3O8 (1271 tonnes of uranium) in 2015"

The US. I wasn't aware there were a significant number of uranium mining deaths? I also assume the US can process it and safely dispose of the waste. No despotic countries of course.

China is a tough one though, I don't know how we justify it.


AFAIK the US imports 80% of its uranium used in power generation?


You didn't really post much about why exactly that's a bad thing.


https://www.eia.gov/energyexplained/index.php?page=nuclear_w...

"Canada–25% Kazakhstan–24% Australia–20% Russia–14% Uzbekistan–4% Malawi, Namibia, Niger, and South Africa–10% Brazil, Bulgaria, China, Czech Republic, Germany, and Ukraine–2%"

So we can toss in the blood diamond / African blood uranium comparison, 10% of American uranium comes from Africa.

But we also have the despotic nation of Russia, so by using uranium you're in some part enriching that anti-liberty mobocracy.


> If we're going to play that game, where did the electricity come from for your watt-hour-sucking GPU?

Pointing out that diamond mining is also not without externalities doesn't mean d0lph is an exponent of Bitcoin mining.


But that isn't an inherent fact of a diamond. Mining them doesn't require that people lose their lives; that's a fact of shitty people. And most diamonds consumers will encounter aren't from that stream.


Yep. Human nature is a different problem set altogether from being intentionally designed to waste electricity.


> intentionally designed to waste electricity.

I strongly doubt the intention was to waste electricity, more likely it was incidentally designed that way, as it's difficult to conceive how you could create such a decentralized cryptocurrency without proof-of-work (at least at the time).


None of them have intrinsic value.

At least gold & diamonds are useful


useful + scarce == intrinsic value, no?


Useful is a subjective value judgement, so... no. But I get what you're saying - I'd say it's more correct to say it's valued, but not that it has intrinsic value.

For example - over history gold has been mostly considered pretty, but not exactly high value. It was only through trade it became representative of value (of the goods).


I didn't suggest they were identical. Only that a limited comparison can be made about their use as a store of value.


How can you put a gold bar, a diamond ring and a string of 0s and 1s in the same sentence in terms of "resources wasted"?


Admittedly diamonds are not a good comparison, because they're not commonly used for investment and not fungible. Gold bullion is comparable insofar as it's used as a store of value, with scarcity controlled by natural limitations, rather than by policy. I'll admit that gold has some practical uses, such as in electronics, but that is not the primary market driving its trade or mining.

Its use in jewellery I would say is just an extension of its use as a store of value; it is desirable because it is valuable. If we simply wanted a lustrous metal (as may have been an early factor in its adoption as a commonly traded precious metal) we now have many inexpensive ones to choose from -- aluminium, chromium, etc.


Well for starters, if everything in the universe is computable, ie. digital physics, then they are all just configurations of information..


https://www.investopedia.com/terms/s/scarcity.asp

Everything in the world is scarce (non-infinite). Just because something is scarce, does not mean it's valuable.


You can't "store value" without trust. Value is not a thing, it is an implicit contract to exchange things over time, and requires trust that everyone will continue to adhere to the contract, ie that debts will be paid.


You're right, I was wrong to say without trust. I think it would have been more apt to say trust in their integrity. Demand can of course change, but your gold won't be physically redefined by other parties.

I think an interesting difference between cryptocurrencies and precious metals is that to some extent they can be redefined by consensus of the network, or by a split consensus (e.g. Ethereum and Ethereum Classic). So I'll concede that they don't even strictly negate the need for trust in their integrity.


I think the bigger trust problem here is trusting that BTC will continue to have value. You have to trust that a market will continue to exist, there are plenty of cryptocurrencies that are worthless. There is no actual reason bitcoin has any value aside from market consensus, and the same could have been said about tulip bulbs at one point.


You have the same issues with gold also. Although it's very unlikely gold would lose most of its value, the price of gold throughout history has fluctuated greatly:

https://upload.wikimedia.org/wikipedia/commons/e/e3/Gold_pri...

Even for assets with industrial applications, there's a risk that it may be replaced with something else, or made cheap (e.g. synthetic diamond; aluminium was a precious metal before electrolytic production; asteroid mining?).

I agree with your comparison of Bitcoin to tulip bulbs. Just because Bitcoin is used as a store of value doesn't mean it's wise.


I am most concerned about the data cable getting cut by ships. Don't know why nobody else here is concerned about that.


Because most of the internet already goes through under-sea cables? We're pretty good at not cutting under-sea cables.


Would have loved to see an FMEA done for this project, from both the environmental and the data centre point of view.


Lots of energy used in storing the data may not always produce effective results.


And what would the effect of the same be on life in the water, or on the environment?


I'm sure the environmental impact of this one unit will be negligible, but if the worlds largest tech companies start deploying thousands of these - imaginable, since it's a pretty good idea from an engineering perspective - I could see that causing some real problems.


> I could see that causing some real problems

Can you be more specific? What type of problems and why? Temperature-related, waste related, hazardous substances? And why would this become a problem at-scale?


> Prof Ian Bitterlin, a data centre consultant for nearly 30 years, is sceptical about the environmental impact of going underwater.

> "You just end up with a warmer sea and bigger fish," he says.

The reason I think it could become a problem at-scale is, again, because it's a great idea from an engineering perspective. Cheap land, cheap cooling, space-efficient (no need for human amenities in the building, like walking space and bathrooms and security checkpoints). I could imagine a scenario a decade or two down the road when most of the world's data centers are dumped off the coasts, at which point there would likely be impact. Setting aside the heat, there's the widespread disruption of the sea floor. Also, while some companies might take environmental precautions, I'm sure that others would not. Why would you bother spending the money to salvage a pod in the case of a tank breach or a fire?

It's far from the biggest environmental concern we have, and it could be done with minimal impact if those doing it cared enough, but in my experience large companies won't go to any expense that doesn't make or save them money.
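
For a sense of scale, here's a back-of-envelope estimate of the local warming from a single pod. The pod's power draw and the seawater flow through its heat exchangers are assumptions, not figures from the article.

  pod_power_kw = 250.0       # assumed total electrical load, all of it ends up as heat
  seawater_flow_l_s = 100.0  # assumed flow through the heat exchangers, litres/second
  density_kg_per_l = 1.025   # seawater
  cp_kj_per_kg_k = 3.99      # specific heat of seawater

  mass_flow_kg_s = seawater_flow_l_s * density_kg_per_l
  delta_t_k = pod_power_kw / (mass_flow_kg_s * cp_kj_per_kg_k)
  print(f"Outlet water is about {delta_t_k:.2f} K warmer than the inlet")

Under these assumptions the warming per pod is a fraction of a degree at the outlet; the concern above is the aggregate effect of thousands of them, plus the seabed disruption, not any single unit.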


Well now we have a plan for boiling the oceans:

Bitcoin mining + underwater data centers


Reminds me of that NOC +10 ARG and the first Bioshock.


What would be its ecological impact?


<sarcasm>Haha, Microsoft, you sunk a data center and now you think you can run Github.</sarcasm>


new snorkelling destination


Brilliant.


data syncing


  # Full fathom six thy server sits;
  # Of its disks are coral made;
  # Those are pearls that were its bits
  # Nothing of it that doth fade
  # But doth suffer sea-migrations
  # Into substrate for crustaceans.
  # Sea-nymphs monitor system vigour:
  # Hark! HEALTH CHECK: FAILURE -- replacement trigger'd


I'm hiding this in my source code somewhere as we speak. Very well written.


If you made a book of these, I would buy it.


This is worthy of being put in fortune(6).


We have been following this since Microsoft launched Project Natick in 2015[1]. Computing has a large effect on our planet; data centers consume over 6% of the world's electricity (more than India) and generate over 4% of the world's CO2 emissions (twice that of commercial air travel)[2]. The approach taken by Microsoft is a nice PR stunt, but we think it's impractical/difficult to carry out maintenance and leads to a very expensive TCO.

FWIW we believe immersion cooling is the most practical approach to reduce energy needs of hungry data centers and still allow them to be situated where they are needed and enable the re-usage of heat. As a shameless plug, I am the co-founder of https://submer.com .

[1] We covered this in our blog (https://submer.com/microsoft-testing-undersea-immersion-cool...) [2] https://www.theguardian.com/environment/2018/feb/20/much-wor...


Hi Polvs, I believe we met in Berlin. I hadn't seen immersion cooling before then but was absolutely impressed. Unfortunately setup costs are quite restrictive, but for large scale organisations it does make sense.


Hi Tom, I think I remember you from the C3 Crypto Conference where we had a booth. I know what you mean, the initial cost of acquiring this technology compared to regular air cooling technologies is more expensive in the very short term, but when you consider the operational costs of electricity, the ROI can be under a year.

It becomes much more interesting when you also factor in that you can pack >4x the computing power into the same physical space, saving a lot of real estate, and that you don't need the huge CAPEX of expensive prepared server rooms, nor to plan ahead for the "hot-spots".
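
A toy payback calculation along those lines; every figure is an assumption for illustration, not a Submer number.

  it_load_kw = 50.0
  hours_per_year = 8760
  electricity_price = 0.12       # $/kWh (assumed)
  pue_air = 1.6                  # assumed air-cooled facility PUE
  pue_immersion = 1.05           # assumed immersion-cooled facility PUE
  extra_capex = 25000.0          # assumed additional upfront cost of the immersion system

  def annual_cost(pue):
      return it_load_kw * pue * hours_per_year * electricity_price

  savings = annual_cost(pue_air) - annual_cost(pue_immersion)
  print(f"Annual energy savings: ${savings:,.0f}")
  print(f"Simple payback: {extra_capex / savings:.1f} years")

With these assumed figures the payback works out to just under a year; real results obviously depend on the actual PUE gap, energy prices and upfront cost.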


Why hasn't this picked up pace though? I am pretty sure I read about immersion cooling at least 4-5 years ago. And 3M showed some new "liquid" 1-2 years ago. Given the scale at which Google, Amazon, Microsoft and Apple now operate, surely they should be the first to adopt?

Edit:

Looks like I asked the question a while ago. From Anandtech and Servethehome:

10 years ago, liquid had a thermal limit. Now it is around 5x that old limit. Problem is cost ($$$/gallon of the liquid) and installation. Has to be marketed on TCO. Also, issues with submerged fiber connections

2U air cooling can reliably handle 8x 200w+ TDP CPUs, 600w of NVMe, plus RAM and add-in cards. 4U designs can handle 6kW of cooling on air. Liquid may be more efficient, and more data centers are being built for it, but air is easy to deploy.
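
Running the quick arithmetic on those air-cooling figures (assuming a 42U rack fully populated with such 2U boxes, which real deployments rarely are):

  per_2u_watts = 8 * 200 + 600     # CPUs + NVMe from the figures above; RAM and cards extra
  racks_of_2u = 42 // 2
  print(per_2u_watts, "W per 2U")
  print(per_2u_watts * racks_of_2u / 1000, "kW+ for a fully packed 42U rack")

That comes out to well over 40 kW per rack, far beyond what a typical air-cooled room is provisioned to remove per rack, which is part of why density is the selling point for liquid.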


2016 article... but still cool!

*edit: no it's not, I have too many tabs open and mixed it up with the blog post listed in the comments. Ignore me.



I think you may be thinking of something else. This was posted today, an hour ago, on the BBC.


I'm pretty sure I heard about this a couple of years ago too.


They did a mini-version in the US a few years ago; this one is 4x the size and 100x the compute.


Bit of clickbait there... Thought this was about them buying GitHub ;)


Great, let's boil the planet


That's what they're looking to avoid, by using passive cooling instead of less efficient refrigeration.


Seems like a good way to create a new country. The country of Microsoft. Go easy on me down voters, I’m just imagining it from the perspective of China, Russia, North Korea, Syria, Iran, Cuba, Libya or Canada doing the same thing and the controversy that would follow.


For years we've used water to cool our CPUs ... what happens if we drop the entire datacenter into water?


Why are you just restating the question of the article?



