Hacker News

> No, I'm literally counting those requirements. Hence routers + server (+100% for overhead, like cooling).

I wouldn't say you're "counting"; rather, you're doing a back-of-the-envelope calculation with eyeballed numbers.

These estimates are difficult by nature. I don't "trust" the report per se, but they do provide sources, from which I can inform myself. That's more than I can say for your calculations, which come with none.

Digital tech is one of the fastest-growing carbon polluters, and data centres alone account for almost 50% of its emissions. Now, the specific attribution to online video may be less clear-cut, but the trends have been replicated in multiple studies. You may also be interested in this recent report: https://www.sciencedirect.com/science/article/pii/S095965261...

> Why do you trust that report? The numbers don't even make sense; just as an example, dividing the consumed VoD data by the watched VoD time, you get 24 Mb/s, yet they say on the same row, "Average Bitrate: 3 Mbps". That's an 8x difference!

Can you point me to this inconsistency?
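The claimed inconsistency is a simple unit check: total data divided by total watch time gives an implied average bitrate, which should match the bitrate the report states on the same row. A minimal sketch of that check (the 10.8 GB / 1 hour figures below are hypothetical inputs chosen to illustrate a 24 Mb/s result; the report's actual row totals aren't quoted in this thread):

```python
def implied_bitrate_mbps(data_gb: float, hours: float) -> float:
    """Average bitrate (Mb/s) implied by total data over total watch time."""
    bits = data_gb * 8e9        # GB -> bits (decimal units)
    seconds = hours * 3600.0
    return bits / seconds / 1e6  # bits/s -> Mb/s

# Hypothetical example: 10.8 GB streamed over 1 hour implies 24 Mb/s,
# an 8x mismatch with a stated "Average Bitrate: 3 Mbps" on the same row.
print(implied_bitrate_mbps(10.8, 1.0))  # 24.0
```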




> I wouldn't say you're "counting"; rather, you're doing a back-of-the-envelope calculation with eyeballed numbers.

Yes, of course. My point is that I wasn't forgetting them.

> I don't "trust" the report per se, but they do provide sources, from which I can inform myself.

And did you?

> Now, the specific attribution to online video may be less clear-cut

That's the thing, though. We know that streaming a video is a very low-power activity, since we can do so with a very low-power device. And I added 100% overhead, whereas the PUE of a real datacentre is actually about 1.09 nowadays.
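The overhead argument can be made concrete: PUE (power usage effectiveness) is just the multiplier from IT equipment power to total facility power. A quick sketch comparing the +100% assumption against the ~1.09 figure cited above (the 100 W IT load is an arbitrary illustration):

```python
def facility_power(it_power_w: float, pue: float) -> float:
    """Total facility power = IT equipment power * PUE (PUE >= 1.0)."""
    return it_power_w * pue

# The +100% overhead used in the estimate corresponds to PUE = 2.0,
# while the comment cites ~1.09 for a modern datacentre; so the
# estimate's overhead assumption is conservative, not optimistic.
it_load_w = 100.0  # arbitrary illustrative IT load
conservative = facility_power(it_load_w, 2.0)   # 200 W total
modern = facility_power(it_load_w, 1.09)        # ~109 W total
```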

So any report that tries to tell me that Google or Amazon, who spend millions on DC energy improvements, are wasting orders of magnitude more energy, is not a serious report.

A very big issue is that they assume a linear correlation between transferred data and power use. For the network part that may be reasonable, but for the DC it absolutely is not. Mining cryptocurrency, for example, takes a huge amount of energy to produce a message of a few KB.

> Can you point me to this inconsistency?

It's in the Technical details of the report.


> A very big issue is that they assume a linear correlation between transferred data and power use.

Right, so they do have to make lots of assumptions (as does every report I've seen that tries to estimate carbon footprint): about the type of network, the device, the energy mix supplying the DCs, etc. For instance, in a country like Germany only 2% of connections go over fibre; the rest is copper. In Italy, lots of people outside the major cities use LTE for regular Internet access, which is absolutely insane in terms of energy use.
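The access-network point matters because per-GB transmission energy differs sharply by technology. A sketch with placeholder intensities (the values below are assumptions for illustration only, ordered the way the literature usually orders them, fibre < copper < mobile; they are not measurements):

```python
# Assumed, illustrative energy intensities in Wh per GB; the ordering
# (fibre cheapest, LTE most expensive) is the point, not the values.
WH_PER_GB = {
    "fibre": 0.1,
    "copper_dsl": 0.3,
    "lte": 2.0,
}

def transmission_energy_wh(data_gb: float, access: str) -> float:
    """Transmission energy for a download over a given access network."""
    return data_gb * WH_PER_GB[access]

# Same 10 GB of video, very different network-side energy depending on
# whether the last mile is fibre, copper, or LTE.
for access in WH_PER_GB:
    print(access, transmission_energy_wh(10.0, access))
```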

> any report that tries to tell me that Google or Amazon, who spend millions on DC energy improvements, are wasting orders of magnitude more energy, is not a serious report.

You're talking past the point here, it seems to me. The report is _not_ claiming that any specific technology is "wasting energy" in terms of efficiency; it poses the question of whether this energy is spent "wisely", given that the world is supposedly trying to reduce global emissions while the consumption of, say, DCs or online video keeps growing. Surely that's not an unreasonable question to raise, don't you think?

Amazon may be efficient, but their energy comes mostly from non-renewable sources. On a scale of 1 to 10, how important do you think this use of dirty energy is?



