
There are models much larger than BERT with a much larger footprint, GPT-3 being the most well-known example.

Models like BERT aren't just trained once when they are developed, either; they are trained again with different domains, different parameters, and in some cases different tasks. There is also fine-tuning (more frequent, less carbon intensive). These are real environmental problems, and others have pointed them out.



Google claims they neutralized their legacy emissions over the entire history of the company.

Now, about the billion cars on the road and the 40% of the world’s electricity being generated from coal.


The existence of other problems does not make a particular problem go away, and humanity can in fact focus on more than one thing at a time.


Google is carbon neutral.

How much more of a problem are a billion cars and 40% of our electricity being generated from coal?

We’ve squandered decades ignoring the big problems, and now people want to run into the weeds with a thousand little problems that individually don’t amount to much.





