
By your own argument, any program that a single human being can write could also contain at most 700 MB of source code, including graphics, videos, etc., which is obviously not true.



You are right, I went too far. But the size of DNA is a good indication of how complex the high-level wiring and topology of neurons likely is, that is, before that neural net has been trained at all.

It takes years to train a human neural net to even be able to speak (toddlers). But regarding the architecture, there's good reason to think the wiring for it is not infinitely complex, taking trillions of terabytes or anything close to that.

I think it is fair to say that the wiring that makes people's brains human is, in some sense, encoded in DNA.

However, I will grant that this might not be helpful. After all, suppose we assume as axiomatic that the whole universe is deterministic and that these 100 megabytes of source code can model it: run that with 10^100 bytes and FLOPS, and after a few days you'd find humans inside. Well, then, sure, in some sense 100 megabytes "describes" the entropy inside human brains, but in another sense that is not useful in any way.

So perhaps DNA is the same. The "knowledge" that 700 MB or so can produce a human (run it on a biological substrate, carry it to term in a womb, then raise it with human culture for 72 months, and it will speak just fine) is not necessarily an indication of what information we would need to give a neural net, or how much code it would take, to produce the same behavior. So while to me personally it remains an indication, I agree I went too far.


I honestly disagree, because in that case any evolution of anything would be impossible. Take a look at Wolfram's A New Kind of Science for an analysis of how massive complexity arises from simple beginnings.
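
To make that concrete, here's a toy Python sketch (mine, not Wolfram's code) of Rule 30, one of the elementary cellular automata analyzed in that book. A one-number rule plus a single live cell is enough to produce a famously chaotic pattern:

    # Rule 30: each of the eight three-cell neighborhoods maps to the
    # next cell state via the corresponding bit of the number 30.
    RULE = 30
    table = {(a, b, c): (RULE >> (4 * a + 2 * b + c)) & 1
             for a in (0, 1) for b in (0, 1) for c in (0, 1)}

    width, steps = 63, 30
    row = [0] * width
    row[width // 2] = 1  # the "simple beginning": one live cell

    for _ in range(steps):
        print("".join("#" if cell else " " for cell in row))
        row = [table[row[i - 1], row[i], row[(i + 1) % width]]
               for i in range(width)]

Run it and you get the irregular, never-repeating triangle Wolfram uses as his flagship example of complexity from trivial rules.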

I will go further and tell you that the state of AI so far is nowhere near as advanced as you think. It is not a flexible intelligence that can figure out connections between concepts (look up Cyc for the closest we've got). It's PARAMETRIC MODELS made by HUMANS, where a neural net simply runs an algorithm written by HUMANS to optimize something. It's just an optimization problem.
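
To be concrete about what "just an optimization problem" means, here is a minimal Python sketch (my own toy example, not any particular system): a two-parameter model fit by gradient descent, where every piece, the model, the loss, and the update rule, was written by a human.

    import random

    # A "neural net" reduced to its essence: a parametric model y = w*x + b,
    # a human-chosen squared loss, and a human-derived gradient update.
    data = [(x, 2.0 * x + 1.0) for x in range(10)]  # the target "concept"
    w, b, lr = random.random(), random.random(), 0.01

    for _ in range(1000):
        for x, y in data:
            err = (w * x + b) - y   # prediction error
            w -= lr * err * x       # gradient steps derived by a human
            b -= lr * err
    print(w, b)  # drifts toward 2.0 and 1.0; optimization, not understanding

The machine contributes arithmetic and repetition; every idea in the loop came from a person.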

You know where all the ingenuity and progress in machines comes from currently? The ability to make PERFECT COPIES AND SEARCH QUICKLY. You see, learning takes a long time and copies information imperfectly. Books could be copied, but search was slow. Now with computers we can try algorithms out, the ones that work better are SELECTED FOR AND COPIED, and anyone around the world can contribute. That is also why open source outcompetes closed solutions.
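
That copy-and-select loop is simple enough to write down. Here's a toy Python sketch of the mechanism (mine, with an arbitrary all-ones bit string standing in for "an algorithm that works better"):

    import random

    # Copy-and-select in miniature: make a perfect copy, vary it slightly,
    # and keep whichever version scores better.
    n = 32
    best = [random.randint(0, 1) for _ in range(n)]

    for _ in range(500):
        copy = list(best)                # a perfect copy
        copy[random.randrange(n)] ^= 1   # one small variation
        if sum(copy) >= sum(best):       # select what works better
            best = copy
    print(sum(best), "of", n, "bits correct")

Perfect copying plus cheap comparison is the whole engine; nothing in it invents the scoring function.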

You will find that the huge innovation in technology and software from the '40s and '50s until now has all come from HUMANS adding to the snowball and replicating good ideas. Generating the new ideas themselves still requires HUMANS.

Show me one AI today that can form connections between concepts to teach itself to solve arbitrary problems, instead of just optimizing some parametric pattern recognizer.


It would be a massive breakthrough if such an AI existed today, and it doesn't.

Given the amount of hardware we can throw at the problem, it may, however, happen at any moment. I am saying we have more than enough hardware for general intelligence to arise out of server farms. But, yes, the millions of years of evolution that gave rise to humans haven't occurred. It would be a breakthrough, yes.



