

Ask HN: Carbon emissions from the Engine Yard contest? - MicahWedemeyer

Way back in the day, I overclocked my desktop computer so I could run SETI@home faster.  I've always wondered how much electricity I wasted by running SETI instead of shutting it down when I wasn't using it.<p>Anyone care to guess how much electricity (and resulting CO2) has been used for the Engine Yard contest?  I'm not saying the contest was some sort of ecological disaster, but I would suggest that perhaps EY should put up a little money toward offsetting the carbon footprint generated.  Plant some trees or something.
======
rcoder
Here's my quick back-of-the-napkin estimate:

Given that the winning teams seemed to be running clusters of around 20-30
systems full-bore for >24 hours, the power draw would be something like 500
watts * 20 systems * 24 hours, giving us something on the order of 240 kWh
consumed per team. (Some teams may have consumed far less, but the successful
ones seemed to be running many CPU or GPU cores on relatively high-spec
servers.)

According to the DoE[1], the average CO2 emissions for power generation
nationwide are 1.3 pounds per kWh. Given that, we get about 312 pounds of
CO2 per team. A single cross-country airline flight can produce several
thousand pounds of CO2, so all told, the contest probably has a smaller
carbon footprint than a major business sales meeting or small tech conference.

1 -
[http://www.eia.doe.gov/cneaf/electricity/page/co2_report/co2...](http://www.eia.doe.gov/cneaf/electricity/page/co2_report/co2report.html)
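The estimate above is easy to sanity-check in a few lines of Python; the inputs (20 systems, 500 W each, 24 hours, ~1.3 lbs CO2/kWh) are the same rough assumptions, not measured figures:

```python
# Back-of-the-napkin CO2 estimate for one contest team,
# using the assumed numbers above (not measured values).
watts_per_system = 500   # assumed draw of a high-spec server under full load
systems = 20             # low end of the 20-30 system cluster estimate
hours = 24               # teams ran full-bore for roughly a day

kwh_per_team = watts_per_system * systems * hours / 1000  # -> 240 kWh

lbs_co2_per_kwh = 1.3    # DoE nationwide average, per the link above
lbs_co2_per_team = kwh_per_team * lbs_co2_per_kwh         # -> ~312 lbs

print(f"{kwh_per_team:.0f} kWh, ~{lbs_co2_per_team:.0f} lbs CO2 per team")
```

Doubling the cluster size or the wattage scales the result linearly, so even the pessimistic end of these guesses stays well under a single cross-country flight.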

~~~
MicahWedemeyer
Exactly what I was asking for. Thanks!

------
synnik
This would be difficult to calculate, because many of the resources dedicated
to EY would still be running without the contest - they just would be doing
something else.

How do you figure out how much extra power is used solely for EY, i.e.,
hardware that would otherwise be turned off?

You'd need to poll all participants and find out what hardware they are
throwing at it vs. their normal operating state. I think it would be a
worthwhile effort, but might be fairly tedious.

~~~
MicahWedemeyer
Yeah, with my SETI example, I overclocked my computer mainly for the purpose
of running the SETI client faster. I assume that an overclocked machine uses
more power, and that an overclocked machine at 100% CPU utilization uses
significantly more.

I'd be interested mainly in a ballpark/analogy figure. Are we talking "I
forgot to turn off the night-light" or "Could power 5 homes for a year"?

