
What do you understand those conditions to be?

The quantity of output consumed (almost) always increases as input costs fall (whether those costs are measured in dollars or in GPUs). But for Jevons paradox to hold, quantity consumed must grow faster than costs fall, i.e. the price elasticity of demand must exceed 1. Otherwise, the result is just that the quantity of output consumed increases while the quantity of inputs consumed decreases.
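
To put that threshold in symbols (notation mine, not from the parent comment): if efficiency improves by a factor E, the price of output falls from P to P/E, and the inputs needed scale as Q(P/E)/E, where Q is the demand curve. Jevons paradox is the case

    Q(P/E) / E > Q(P)

that is, the quantity response has to outrun the efficiency gain itself.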

Applied to AI and NVIDIA, the effect of an increase in AI-per-GPU on the demand for GPUs depends on the demand curve for AI. If the quantity of AI consumed were completely independent of its price, the result of better efficiency would be cheaper AI, no change in the quantity of AI consumed, and a proportional decrease in the number of GPUs needed. Of course, that's not a realistic scenario.

(I'm using "consumed" as shorthand; we both know that training AIs does not consume GPUs, and AIs are not consumed like apples. I'm using "consumed" rather than "demand" because demand has multiple meanings, referring both to a quantity demanded and to a bid price, which would confuse the conversation.)

But here is a scenario that is potentially realistic: the cost of training/serving AI drops by 90% (i.e. a 10x efficiency gain), the quantity of AI consumed increases by a factor of 5, and the end result is that the economy needs only half as many GPUs as it needed before.
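
A quick sanity check on that arithmetic, in Python (the numbers are the hypothetical ones from the paragraph above, not real data):

    efficiency_gain = 10         # cost per unit of AI falls by 90% -> 10x AI per GPU
    quantity_multiplier = 5      # AI consumption rises 5x in response
    relative_gpus_needed = quantity_multiplier / efficiency_gain
    print(relative_gpus_needed)  # 0.5 -> half as many GPUs as before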

For Jevons paradox to hold, if the efficiency of converting GPUs to AI increases by a factor of X, so that the price falls to 1/X of its former level, the quantity of AI consumed must increase by a factor of more than X in response to that price decrease. That's certainly possible, but it's not guaranteed; we basically have to wait and observe it empirically.
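
Here's a minimal sketch of that condition, assuming (my assumption, purely for illustration) a constant-elasticity demand curve Q = k * P^-e; the function name and numbers are made up:

    # GPUs needed scale as (quantity of AI consumed) / (AI per GPU).
    def relative_gpu_demand(efficiency_gain, elasticity):
        # Price falls to 1/efficiency_gain; quantity responds per the demand curve.
        quantity = efficiency_gain ** elasticity   # Q(P/E) / Q(P) for Q = k * P^-e
        return quantity / efficiency_gain          # = E^(e-1)

    print(relative_gpu_demand(10, 1.2))  # ~1.58: elasticity > 1, GPU demand grows (Jevons)
    print(relative_gpu_demand(10, 0.7))  # ~0.50: elasticity < 1, GPU demand shrinks
    print(relative_gpu_demand(10, 0.0))  # 0.10: the perfectly inelastic case above

The break-even point is an elasticity of exactly 1, which is why the quantity response has to more than offset the price drop.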

There's also another complication: as the efficiency of producing AI improves, substitutes for datacenter GPUs may become viable. It may be that the total amount of compute hardware required to train and run all this new AI does increase, yet big-iron datacenter investments are still obsoleted by the change, because demand shifts to alternative providers that weren't viable when efficiency was low: for example, training or running AIs on smaller clusters, or even on mobile devices.

If tech CEOs really believe in Jevons paradox, then having decided last month to invest $500 billion in GPUs, this month, after learning of DeepSeek, they should conclude that $500 billion is not enough: they'll need to buy even more GPUs, and pay even more for each one. And, well, maybe that's the case. There's no doubt that demand for AI is going to keep growing. But at some point, investment in more GPUs trades off against other investments that are also needed, and the thing the economy most urgently lacks ceases to be AI.



