> It costs $700,000 per day in energy costs to run ChatGPT 3.5, according to recent estimates, and leaves behind a massive carbon footprint in the process.
Compared to what? I wouldn't defend LLMs as "worth their electricity" quite yet, and they are definitely less efficient than a lot of other software, but I'd still like to see how this compares to gaming consoles, or email servers, the advertising industry hosting costs, cryptocurrency, and so on. Just doesn't seem worth pointing out the carbon footprint of AI just yet.
Any article citing the power usage without calculating it in terms of users or queries is just trying to push an agenda by omitting how many people are using it.
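To make that concrete, here's a rough back-of-envelope sketch. Only the $700k/day figure comes from the quoted article; the electricity price, the query volume, and even treating the $700k as pure electricity spend are assumptions of mine, purely for illustration.

```python
# Back-of-envelope: energy per query, under loudly stated assumptions.
# None of these numbers are measured; they only show how the arithmetic works.
daily_energy_cost_usd = 700_000       # figure quoted in the article (assumed to be all electricity)
electricity_price_usd_per_kwh = 0.10  # assumed industrial electricity price
daily_queries = 100_000_000           # assumed query volume, purely illustrative

daily_energy_kwh = daily_energy_cost_usd / electricity_price_usd_per_kwh
wh_per_query = daily_energy_kwh * 1000 / daily_queries
print(f"~{daily_energy_kwh:,.0f} kWh/day, ~{wh_per_query:.1f} Wh per query")
# With these assumptions: ~7,000,000 kWh/day, or roughly 70 Wh per query.
```

Change any of the inputs and the per-query number moves accordingly, which is exactly why reporting the total without the denominator tells you very little.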
When you make something more available/cheaper, overall usage often goes up.
Overall energy use is an important metric regardless of energy per task.
Airplanes are WAY more efficient per passenger than they were in the past, but it's still valid to express concern over the energy usage and pollutants of air travel with so many more routes being flown.
How many people are using it because it's being offered at far below operating costs, even before you factor in the externalities of massive energy use?
I meant more that the percentage is so low that, even if this usage is entirely on top of all other usage (which isn't a given), singling it out is like calling out any other item in the long tail of technology for leaving behind a "massive carbon footprint". Yes, it matters, especially in a report focused on sources of carbon emissions, but in general, saying "AI's carbon footprint is bad" just seems like wanting to give it a bad name. In reality, AI doesn't seem to be such a big contributor percentage-wise. Of course it should still be optimized; I'm not arguing against that.
Just because the carrier of energy is source-independent doesn't mean the consumer of that energy is not responsible for the carbon emissions of its production. Since we're talking hundreds of TWh[1], the policies of those consumers can have a massive impact on global emissions.
> It's not like AI replaced anything you mentioned.
I mean it's true it hasn't replaced anything the OP mentioned, but it has definitely replaced parts of the compute that I would normally use for e.g. searching.
If people are activating Google's servers less, that's some energy saved. I don't know how it compares, but I guess OpenAI aren't running a live bidding war with advertisers on every request.
It's fair in some cases, but not in others. For example, code completion with LLMs is a time saver and worth paying for compared to any other tech out there. At least that's how I see it; I know some people say it introduces mistakes, but that's up for debate.
I do think that for translation tasks, grammar correction, information lookup, etc., it adds a competitive convenience factor. But I can't argue that running state-of-the-art GPUs at very high wattage for up to a minute is anywhere near as efficient as specialized software that does the same job in milliseconds at much lower wattage. I'm referring to running multiple Google searches yourself to answer a question, or using a more traditional translation service, spell checker, and so on.
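To put rough numbers on that gap (every wattage and runtime below is an assumption, chosen only to show the order of magnitude):

```python
# Illustrative energy comparison: LLM inference vs. a specialized service.
# All figures are assumptions picked for the sake of the comparison.
gpu_watts, gpu_seconds = 400, 30         # assumed GPU draw and generation time for one answer
server_watts, server_seconds = 100, 0.1  # assumed per-request share of a conventional service

llm_wh = gpu_watts * gpu_seconds / 3600
classic_wh = server_watts * server_seconds / 3600
print(f"LLM answer: ~{llm_wh:.2f} Wh, specialized service: ~{classic_wh:.4f} Wh "
      f"(~{llm_wh / classic_wh:.0f}x)")
# With these made-up numbers the LLM call uses on the order of 1000x more energy per task.
```

Even if the real ratio is off by an order of magnitude in either direction, the qualitative point stands: the convenience comes at a real per-task energy premium.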
I asked GPT-4 to estimate how much CO2 this likely emits, in units of "typical car usage in a city." It suggests this emits roughly as much CO2 as Reno or Des Moines. That's staggering, but there are about 100 cities this size in the US, so decreasing car usage 1% would more than offset this. I know this is a bizarre comparison to make, but CO2 emissions are fungible.
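For what it's worth, that comparison can be sanity-checked by hand. Every constant below (grid carbon intensity, per-car emissions, car count for a Reno/Des Moines-sized city) is an assumption I'm supplying for illustration, not a figure from GPT-4 or the article.

```python
# Sanity check of the "one mid-sized city's worth of car CO2" comparison.
# Every constant here is an assumption chosen for illustration.
daily_energy_kwh = 7_000_000   # from the earlier sketch ($700k/day at an assumed $0.10/kWh)
grid_kg_co2_per_kwh = 0.4      # assumed average grid carbon intensity
annual_co2_tonnes = daily_energy_kwh * grid_kg_co2_per_kwh * 365 / 1000

car_co2_tonnes_per_year = 4.6  # assumed per-car annual emissions
cars_in_midsize_city = 200_000 # assumed car count for a Reno/Des Moines-sized city
city_car_co2_tonnes = cars_in_midsize_city * car_co2_tonnes_per_year

print(f"LLM estimate: ~{annual_co2_tonnes:,.0f} t CO2/yr")
print(f"Mid-size city's cars: ~{city_car_co2_tonnes:,.0f} t CO2/yr")
# ~1.0 million vs ~0.9 million tonnes/yr with these assumptions: same order of magnitude.
```

So the "one mid-sized city" framing holds up at least to an order of magnitude under these assumptions, which is really all a fungibility argument needs.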