Hacker News

I completely agree that the human brain is pre-wired for language; in fact, we also have experimental evidence that adults instinctively do the right things when teaching babies language.

But the human brain has 86 million neurons total for everything it does, while GPT-3 uses 175 billion ANN parameters just to read and write digital text in English. To my mind this also supports the idea that current models are at least 3 orders of magnitude too computationally expensive as compared to humans.



>86 million

*billion

"The human brain contains 86 billion neurons, with 16 billion neurons in the cerebral cortex."

https://en.wikipedia.org/wiki/List_of_animals_by_number_of_n...

The difference is not 3 orders of magnitude, but merely a factor of ~2.


Yeah, sorry, that was a typo; I did not base my argument on the 86 million number.

I'm saying: if our brain has 86 billion neurons, the vast majority is not used for language. And even in the language centers, those neurons handle written language input and output in digital as well as analog form across multiple fonts (so including OCR, which GPT-3 does not do), and spoken language (both hearing and speaking). Moreover, humans can learn many languages with different grammars, in addition to body language cues, dialects, sociolects, slang, and humor/sarcasm.

So I still argue there is a 3 order of magnitude difference.
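The order-of-magnitude claims in this thread come down to simple arithmetic. Here is a rough sketch; the fraction of neurons devoted to language is an illustrative assumption, not a measured value:

```python
import math

# Figures from the thread above.
total_neurons = 86e9   # human brain, total neurons
gpt3_params = 175e9    # GPT-3 parameter count

# Naive whole-brain comparison: only a factor of ~2.
whole_brain_ratio = gpt3_params / total_neurons
print(f"whole brain: {whole_brain_ratio:.2f}x")

# If only ~1% of neurons handled language (an illustrative guess,
# not a neuroscience figure), the gap widens considerably.
language_ratio = gpt3_params / (total_neurons * 0.01)
print(f"vs. assumed language share: {language_ratio:.0f}x, "
      f"~{math.log10(language_ratio):.1f} orders of magnitude")
```

The conclusion is very sensitive to the assumed language fraction: 1% gives roughly 2 orders of magnitude, while 0.1% would give over 3, so the disagreement in the thread hinges on that unknown.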


Moreover, an ANN parameter is not equivalent to a human neuron.


And at the end of that, GPT-3 will happily emit sentences that are grammatically correct but free of meaning, while a 2 year old utters word-strings that are full of meaning but grammatically poor.



