
I’m not able to read billions of books in less than an hour.

I think you underestimate the sheer volume of data + conclusions the brain ingests and processes on a daily basis, primarily through unconscious experience.



The training set for GPT-3 is about 500e9 tokens; any given synapse in a human in their lifetime is going to fire about 2e9s * (10% * {100Hz to 1000Hz}) = 20e9 to 200e9 times.
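
For concreteness, the back-of-envelope in Python (a minimal sketch, assuming the ~2e9-second lifetime, 10% duty cycle, and 100–1000 Hz firing rates quoted above; all figures are rough order-of-magnitude estimates):

    # Rough comparison: GPT-3 training tokens vs. lifetime firings
    # of a single synapse, using the figures from the comment above.
    gpt3_tokens = 500e9              # approximate GPT-3 training set size

    lifetime_seconds = 2e9           # ~63 years
    duty_cycle = 0.10                # assumed fraction of time spent firing
    rate_low_hz, rate_high_hz = 100, 1000  # assumed firing rates while active

    firings_low = lifetime_seconds * duty_cycle * rate_low_hz    # 2e10
    firings_high = lifetime_seconds * duty_cycle * rate_high_hz  # 2e11

    print(f"GPT-3 tokens:    {gpt3_tokens:.0e}")
    print(f"Synapse firings: {firings_low:.0e} to {firings_high:.0e}")

So a single synapse fires on the same order as the number of tokens GPT-3 saw, and the brain has on the order of 1e14 synapses.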


It sounds like you agree with them, since there are a lot of synapses.


Au contraire.

That our brains are more complex than these networks has never been in dispute.

The quantity of experience needed to train GPT-3, however, is far greater than what we are capable of experiencing in a lifetime.


GPT-3 is learning from ridiculously dense data, though.

Also, on the volume of data processed by the brain: https://gwern.net/Differences


Not to mention the millions of years spent building our nervous system.


Hundreds of millions of years.



