More than the residential use of some entire cities.
This is a misleading comparison. "AI data centers" aren't a residential use, they're much closer to an industrial one, and they should be considered on that basis.
For comparison, a 2007 report on the US aluminum industry (https://www1.eere.energy.gov/manufacturing/resources/aluminu...) notes that the industry used 90e9 kWh/yr in 2003. (Aluminum is a particularly heavy user of electricity, since its main processes are electric-driven rather than chemical/combustion like steel.) That's about 10 GW of power on a continuous basis, comparable to the data center usage discussed in the article.
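The conversion from annual consumption to continuous power is easy to sanity-check (using the 90e9 kWh/yr figure from the report):

```python
# Convert the aluminum industry's annual electricity use to average power.
annual_kwh = 90e9        # kWh per year, 2003 figure from the DOE report
hours_per_year = 8760    # 365 days * 24 hours

avg_power_gw = annual_kwh / hours_per_year / 1e6  # kW -> GW
print(f"{avg_power_gw:.2f} GW")  # roughly 10 GW continuous
```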
An exact comparison is more difficult because the article discusses a number of individual projects, without aggregating them over time.
Is this projected energy usage good? I don't know, but the statement that "AI will be as important to the United States as the aluminum industry" doesn't seem too outlandish.
It also assumes no progress in power efficiency during inference.
If the price of OpenAI's mini models is indicative of compute, and therefore energy usage, I would be surprised if we didn't see a lot of similarly energy-optimized models in the future.
Computing as a whole has become a lot more efficient compared to, say, 30 years ago, but globally we certainly use a lot more compute to make up for that increase in efficiency.
Yeah, I don't believe for a second that we're ever going to see a net decrease in AI energy use during our lifetimes at this point.
Besides, the point of discussing this isn’t that AI isn’t efficient enough, it’s that there is a clear path to it consuming absolutely massive amounts of energy, and that will continue to scale at an increasing rate indefinitely. This is a reality we need to expect and plan for. Substantial investments must be made - today - in renewable energy production and storage to meet new demand.