They're not. This story isn't very good. The point it makes is somewhat valid, but the reporter's lack of understanding of the details means the story, as written, doesn't provide much insight.
Likewise, I'm guessing "utilization" is measured by taking a loadavg, which is... not really a complete measure of utilization. Load average counts runnable (and uninterruptibly sleeping) tasks, not how busy the CPUs actually are.
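To make the distinction concrete, here's a minimal Linux-only sketch contrasting the two numbers. It assumes the standard `/proc/loadavg` and `/proc/stat` formats; a box stuck in disk I/O wait can show a high loadavg while its CPUs are mostly idle, so the two can diverge wildly.

```python
import time

def read_loadavg():
    # 1-minute load average: the average count of tasks that are
    # running or waiting to run (plus uninterruptible sleepers).
    with open("/proc/loadavg") as f:
        return float(f.read().split()[0])

def read_cpu_times():
    # Aggregate CPU jiffies from the first line of /proc/stat.
    # Field order: user nice system idle iowait irq softirq ...
    with open("/proc/stat") as f:
        fields = [int(x) for x in f.readline().split()[1:]]
    idle = fields[3] + fields[4]  # idle + iowait count as "not busy"
    return sum(fields), idle

def cpu_utilization(interval=1.0):
    # Utilization = 1 - (idle delta / total delta) over the interval,
    # i.e. the fraction of CPU time spent doing actual work.
    total1, idle1 = read_cpu_times()
    time.sleep(interval)
    total2, idle2 = read_cpu_times()
    return 1.0 - (idle2 - idle1) / (total2 - total1)

if __name__ == "__main__":
    print(f"loadavg (1m): {read_loadavg():.2f}")
    print(f"cpu util:     {cpu_utilization():.1%}")
```

Note that loadavg is unbounded (a value of 40 on a 4-core box is possible), while utilization is capped at 100%, which is one reason reporting either one alone tells you little.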
Well, they could be...
Demand Side Response is a mechanism where utility companies pay energy consumers to stop using grid power at peak times. Effectively, a company is treated as a power generator, but instead of selling power to the grid it simply consumes less, either by turning equipment off or by relying on on-site power generation for a while.
It's popular in lots of industries but there seems to be relatively little (publicised) adoption in the data center business yet.
The growing datacenter trend is to build dedicated co-generation facilities (typically with the power company footing half the bill, because you're doing them a favor by taking load off the local grid).
In the US, generator power is considered dirty and is only meant to be run during emergencies. Running generators at non-emergency times will typically get you a nice call/visit from the state's environmental agency, courtesy of your complaining neighbors. Said agency will ask you to curtail your usage to no more than X hours per month.
In the UK, DSR consists of entering into a contract with National Grid, one or four years in advance, to produce energy during peak times (not just "reduce usage") at an auctioned rate, with a stiff penalty if you fail to deliver. Plus, you're not allowed to participate in back-to-back years.