
Does anyone know if there are numbers somewhere for how much energy the different cloud providers use? Would be interesting to see which is most efficient from that perspective, for those of us who care about such things. Energy per dollar of revenue, or even better per unit of compute, would be helpful.



A cloud provider that specializes in compute will have vastly higher energy costs than a cloud provider that specializes in storage. Even without explicit specialization, it's not unreasonable to assume that the distribution of compute-heavy and storage-heavy customers is quite different between providers. So you'd have to normalize for that somehow.

But if you do that, I'm not sure what would even differentiate them. They're all using essentially the same hardware. They all have the same cooling requirements. I wouldn't expect an average to be very interesting.


> I'm not sure what would even differentiate them

Distance from carbon-neutrality per dollar of revenue sounds like a good start.


Carbon neutrality is a flawed statistic IMO and can be doctored a lot - e.g. if Google pays a company X to plant some number of trees, who's to say that actually neutralizes their carbon?

Keep it simple, just publish how much power they're using and what the sources are.


I’m not sure what you mean by publish “what the power sources are.” Data centers are grid connected and are therefore connected to a lot of different generators—including coal, natural gas, wind, solar, and hydro. All generators on the grid contribute to the grid. It’s really tough to single out a generation source for a particular user.


I remember walking by a Google data centre on the Columbia river years ago and being told that it was sited there so that it could use the power from the dam.


Or for water cooling the CPUs?


Personally, I don’t understand carbon neutrality. How deep does this rabbit hole go? If Google only buys and uses fully electric cars for Street View, and uses only solar and wind to power them, is that good enough? Does the manufacture of the cars, batteries, and solar panels also have to be carbon neutral? If not, can a company become carbon neutral by simply letting another company do the dirty work?


Yes. It goes all the way down. That is what makes it very, very hard to fix.


Sure but that's going to mainly depend on where their energy comes from, not how much they use.


I know that Microsoft's datacenters, and probably many others, are parked next to cheap power sources like hydro plants that aren't near any major demand sources.


Cooling is a pretty big factor in the energy usage of datacenters, and there's a lot of room for creative use of technology there - or simply geographic advantages.


It's a lot smaller than you think these days with all the effort that has gone into it.

Google reports an average PUE[1] of 1.11 over the past year, 1.09 over the past quarter, and the latest/best data centers are at 1.06 [2]. In simple terms, a PUE of 1.06 means the total power overhead for cooling etc. is just 6% on top of the power used by the machines themselves.

Google (and others) have gone to great lengths, using AI and radical new ideas for cooling.

Disclaimer: I work at Google, but not on datacenters. All info is publicly available.

[1] Definition of PUE: https://en.wikipedia.org/wiki/Power_usage_effectiveness

[2] Source: https://www.google.com/about/datacenters/efficiency/
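A quick back-of-envelope showing what those PUE figures imply (pure arithmetic on the publicly reported numbers above, nothing else assumed):

    # PUE = total facility power / IT equipment power, so the overhead
    # (cooling, power distribution, etc.) relative to the IT load is PUE - 1.
    def overhead_fraction(pue):
        return pue - 1.0

    for label, pue in [("fleet, trailing year", 1.11),
                       ("fleet, latest quarter", 1.09),
                       ("best facilities", 1.06)]:
        print(f"{label}: {overhead_fraction(pue):.0%} overhead on top of IT power")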


You can use steam from cooling to produce electricity just like nuclear plants do, right? :)


No, because computer chips don't operate at 100°C (let alone 300°C like a nuclear reactor). Low-temperature heat flows contain little usable work. Typically the best option is to dump the spare heat into the nearest body of water, or into the air.
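To put rough numbers on that, here's a sketch using the Carnot limit; the 60°C exhaust and 25°C ambient temperatures are illustrative assumptions, not measured figures:

    # Maximum fraction of a heat flow convertible to work between two
    # temperatures (the Carnot limit); temperatures converted to Kelvin.
    def carnot_efficiency(t_hot_c, t_cold_c):
        return 1.0 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

    print(f"Datacenter exhaust, 60C vs 25C ambient: {carnot_efficiency(60, 25):.0%} max")
    print(f"Nuclear-plant steam, 300C vs 25C:       {carnot_efficiency(300, 25):.0%} max")
    # ~11% vs ~48% -- and real turbines achieve only a fraction of the Carnot limit.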


I agree it's rather strange to ask who has the lowest energy usage when you're a cloud customer. "I want to rent this computer to do X, but it shouldn't consume any electricity" is not a reasonable demand.

However: one could say that these companies are now big enough that it is reasonable to start demanding that they source their energy in a climate-friendly way. I think it's not at all unreasonable to demand that, when Google or Amazon builds massive new data centers, they be responsible about how those data centers are powered.


I'm sure Google will have no problem providing a checkbox for you that will double your prices.


Google is 100% renewable and is in my opinion the leader among the giants (AWS, Azure, Google, Ali), but there might be smaller companies that are even better.

https://www.blog.google/outreach-initiatives/sustainability/...


Google is on the list of companies working to remain carbon neutral.

...Microsoft is setting the goal of being carbon negative, and of eventually removing all the carbon it has emitted since its founding in 1975: https://blogs.microsoft.com/blog/2020/01/16/microsoft-will-b...


Google is not really 100% renewable in any meaningful sense. Google (and others) buy renewable energy, but that doesn't mean their purchases match up with what their data centers actually consume, for the simple reason that data centers have a pretty consistent load and wind and solar do not produce electricity consistently. Truly progressive companies in this area need to match hourly generation and load, not just annualized generation and load: https://cleantechnica.com/2019/05/27/100-renewable-does-not-...
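A toy example (entirely made-up numbers) of how annualized matching can claim over 100% renewable while the hourly picture is much worse:

    # Flat 100 MW datacenter load vs. a solar PPA that only produces
    # during 12 daylight hours. Hypothetical numbers for illustration.
    load  = [100.0] * 24
    solar = [0.0] * 6 + [300.0] * 12 + [0.0] * 6

    annual = sum(solar) / sum(load)
    hourly = sum(min(l, s) for l, s in zip(load, solar)) / sum(load)
    print(f"Annualized matching: {annual:.0%}")   # 150% "renewable" on paper
    print(f"Hourly matching:     {hourly:.0%}")   # only 50% of load actually covered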


This seems like an overly negative assessment. If Google can pay someone else to use solar during the day instead of coal, surely we can give them credit for being 100% renewable. Once the opportunities to do this temporal shifting of renewables are exhausted, maybe then it's fair to level this criticism.



How effective are carbon credits at eliminating footprint? If company X runs an efficient datacenter with 80 GW wind and 20 GW coal power, and company Y runs a dumpy datacenter with 200 GW coal power, but company Y buys 200 credits, is company X still the one doing the most harm?


For now, company Y is probably doing better (purely from a CO2 emissions standpoint). But it's not sustainable, because some of those credits are from companies that just reduced their usage below some threshold. Most of the credits are not from "negative emissions" like planting trees. So as the thresholds decrease over time, there will be fewer credits available to buy, and they will be stuck with either really expensive credits or a surplus of CO2 generation.
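Working that scenario through with the paper accounting (hypothetical flat emissions factor for coal; each credit assumed to offset one GW-equivalent of coal generation):

    # Units follow the parent comment; COAL_CO2 is an arbitrary emissions
    # factor, since only the ratio between the two companies matters here.
    COAL_CO2 = 1.0

    x_net = 20 * COAL_CO2                    # 80 wind (zero) + 20 coal, no credits
    y_net = 200 * COAL_CO2 - 200 * COAL_CO2  # 200 coal, 200 credits claimed
    print(f"Company X net: {x_net}")         # 20.0
    print(f"Company Y net: {y_net}")         # 0.0 -- but only if every credit
                                             # represents a real, additional reduction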


It's much more complex than this. You can't run a data center on wind. Data centers are all grid connected, so what really matters is the composition of generators on the grid (or at least the composition of the generators near the datacenter).


> It's much more complex than this. You can't run a data center on wind.

You certainly could run it on a dedicated combination of any mix of wind/solar/hydro power and stored (e.g., battery or regenerative fuel cell) power you wanted to.

In practice, you'll probably want to connect it to the grid, but directed purchases on the grid, while they don't actually select which source really powers the DC, have all the practical effect of doing so.
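For a sense of scale, a crude sizing sketch of the dedicated-supply option; the load, lull duration, and capacity factor are all made-up assumptions:

    # Battery sized to carry a flat load through a windless spell, plus
    # enough wind nameplate capacity to balance the load on average.
    load_mw = 100.0
    lull_hours = 48                      # design for a 2-day calm
    capacity_factor = 0.40               # assumed average wind output

    battery_mwh = load_mw * lull_hours
    nameplate_mw = load_mw / capacity_factor
    print(f"Battery to ride out the lull: {battery_mwh:.0f} MWh")
    print(f"Wind nameplate for average balance: {nameplate_mw:.0f} MW")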


You could do that, but no one is powering their data center that way. They are all grid connected.


So Amazon is the worst; Google does quite well.


The providers are already financially incentivised to minimise their power consumption. I’m not sure any change is needed. I suppose the power source might be relevant to the customer (e.g. coal vs hydro).


Should I expect ARM processors to be the most energy efficient for compute? Eg if I’m in AWS, should I be choosing graviton instances?


The answer seems to be yes for the latest Graviton2, according to initial benchmarks.


This needs to include the energy cost of data transmission to/from cloud data centers as well for a fair comparison. These server farms don't exist in isolation.


Only if you assume that there are substantial differences in those costs between the different cloud providers.


I'd say so. It makes a difference when you're doing things like hyping Google for using only renewable energy sources for their cloud data centers. OK, cool, but all the network infrastructure along the way doesn't run on renewables.

Only somewhat related, but I'm quite certain that on-premises solutions would win over most cloud services if the energy cost of data transmission were factored in.


Energy usage and compute amount are company secrets that they simply won't discuss in the open.


All of these vendors will brag about their PUE and energy sourcing. Nobody will tell you their electric bill, because it’s meaningless to you and meaningful to competitors.

Google regularly brags about their facilities and you can find info with 30 seconds of research. AWS is a little more cagey, they will brief customers. Microsoft is more like Google.



