Hacker News

I don't think you are correct from an information-processing point of view. When looking at the computation done by neurons in the brain, it is sufficient to abstract away the lower-level substrate in which it occurs.

You and other posters are all correct regarding the huge volume of information on which human minds are trained. It's hardly unsupervised learning, either :)


EDIT: In response to your comment, I've given it further reflection. DNA as source code may be misleading as an "upper bound". After all, suppose we knew for a fact (assume there were a mathematical proof, or just assume it axiomatically) that a one-hundred-megabyte source code file completely described a deterministic Universe (containing every physical law, etc.), and that if you ran it on enough computation to fully simulate a hundred billion galaxies with a hundred billion stars each, one of those stars would have a planet, that planet would contain humans, and the humans at some point in the simulation would deduce the same one-hundred-megabyte source code file for their Universe. (This is a bit of a stretch, as it's not possible to deduce the laws of the universe rigorously.)

Under that assumption, you could argue that the "upper bound" on the amount of entropy it takes to produce human intelligence is "just" a hundred megabytes, since that source code can deterministically model the entire Universe. But practically that is useless, and the humans in that simulation would have to do something quite different from modeling their universe if they wanted to come up with AI to help them with whatever computational tasks they cared about.

In the same way, perhaps DNA is a red herring: there are a vast number of cells in the human body (tens of trillions) doing an incredible amount of work. Starting out with DNA is the "wrong level" of emulation, just as starting out with a 100 MB source code file for the universe would be the "wrong level" of emulation, even if we posit axiomatically that it fully describes our entire Universe from the big bang through intelligent humans.

So I will concede that it is misleading.

All that said, I think that emulating or considering the computation at the level of neurons is sufficient: it is enough to look at how many neurons are in the human brain and the kinds of connections they have.

As for the efficacy of this approach: that's the very thing being shown in the story we're replying to and in many places elsewhere. It works. We're getting extremely strong results that in some cases beat humans.

I believe that emulating or comparing to humans at the neural level should be sufficient for extremely strong results. We do not need to emulate every atom or anything like that. I consider it out of the question that we would discover that human minds form a biochemical pathway into some ethereal soul-plane, connecting with our souls in a way that can't be emulated by emulating neurons and the like, with the souls being where intellect happens and brains just "radio antennas" for them. Instead, I think the approaches we're seeing will achieve results computationally similar in many ways to what human brains produce: a much higher level of abstraction is sufficient for the results that are sought.

I will confess to not being an expert, but I disagree: I don't think it's sufficient to abstract away the lower-level substrate when the OP was referring to DNA as source code, which absolutely depends on that level of detail both to construct the system (the brain) and to enable the continued development and maturation of that system (a physical, real brain).

I was not referring to the huge volume of information necessary, as I acknowledge that as being "outside the system" for purposes of this discussion, so my apologies for any confusion I might have caused.

It may be that (and it is my belief that) there is a higher-level abstraction for the computations taking place in the brain, even if it is at the neuron level, but at that point I don't think you can claim that the source code for it will fit under 700 MB by using DNA as a baseline.
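For what it's worth, the ~700 MB figure can be sanity-checked with back-of-the-envelope arithmetic. (The genome size and the naive 2-bits-per-base encoding below are assumptions; real estimates vary, and the genome is highly compressible.)

```python
# Back-of-the-envelope check of the "DNA as a ~700 MB baseline" figure.
# Assumptions: ~3.1 billion base pairs in the haploid human genome,
# 2 bits per base (four bases, log2(4) = 2), no compression.

BASE_PAIRS = 3.1e9   # approximate haploid human genome size (assumption)
BITS_PER_BASE = 2    # A/C/G/T -> 2 bits each

total_bits = BASE_PAIRS * BITS_PER_BASE
total_megabytes = total_bits / 8 / 1e6

print(f"{total_megabytes:.0f} MB")  # roughly 775 MB uncompressed
```

So the uncompressed figure lands in the same ballpark as the 700 MB claim, which is presumably where that number comes from.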

You are right, I went too far. See my other comment:


However, that only concerns the DNA argument. We know roughly how many neurons there are in humans and how connected they are.

> (This is a bit of a stretch as it's not possible to deduce the laws of the universe rigorously.)

Not sure I agree. This is known as the problem of induction, and Solomonoff tackles it quite well, I think.

Certainly it is not possible to deduce every law governing the Universe! For example, posit there is source code for the Universe; then assume a second Universe that is identical except that its source code differs in one way: after 35 billion years, something suddenly happens. Since there is no way to investigate the laws directly, nobody inside could possibly "deduce" the complete set of laws; there is no way to know that the source code contains a check that gets tripped after 35 billion years. Only one of the two versions is the one running the universe, but it is not possible to determine which, so the inhabitants of the Universe cannot deduce the laws that govern them: for the first 35 billion years the two universes are bit-for-bit identical.

So if I wanted to assume there is a one-hundred-megabyte source code that is exactly the source code governing the entire Universe, I'd have to assume that axiomatically. We could never "deduce" it and know for certain that it is the only possible source code and that our Universe must be following exactly it and nothing else.
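The two-universes argument above can be sketched as a toy program (the names, the cutoff, and the "laws" here are all invented for illustration):

```python
# Two candidate "laws of the universe" that are bit-for-bit identical
# up to some cutoff, then diverge. No observation made before the
# cutoff can tell them apart.

CUTOFF = 35  # stands in for "35 billion years"

def law_a(t: int) -> int:
    """Plain law: state of the (toy) universe at time t."""
    return t % 7

def law_b(t: int) -> int:
    """Same law, plus a hidden check that trips after the cutoff."""
    if t > CUTOFF:
        return 0  # "all atoms turn into nitrogen"
    return t % 7

# Identical on every observation up to and including the cutoff...
assert all(law_a(t) == law_b(t) for t in range(CUTOFF + 1))

# ...yet they are genuinely different laws:
assert law_a(CUTOFF + 1) != law_b(CUTOFF + 1)
```

An observer living before the cutoff has no data that distinguishes `law_a` from `law_b`, which is the sense in which the complete set of laws cannot be deduced from inside.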

At least, this is my thinking...

You absolutely can deduce those laws; that is what Solomonoff induction does [1]. It's been formally proven to converge on the correct function in the limit.

> Only one of the two versions is the one running the universe, but it is not possible to determine which, so the inhabitants of the Universe cannot deduce the laws that govern them: for the first 35 billion years the two universes are bit-for-bit identical.

Correct, until they're not, at which point you can distinguish them. Until then, you prefer the laws whose axiomatic complexity is lower. It's basically a formalization of Occam's razor.

[1] https://en.wikipedia.org/wiki/Solomonoff's_theory_of_inducti...

I was using "deduce" to mean: arrive at the actual set of laws (the laws "deduced" perfectly match reality indefinitely going forward) and correctly know that they must be the actual set currently running the universe.

In your phrasing, it seems that you would assume the following sentence does not contain a contradiction.

> Alice, in the year 23,435,342,435.03453725, correctly deduced all laws governing her Universe, and through a clever mathematical trick was able to correctly deduce the state of the universe in the year 23,435,342,450.03453725 (15 years later, i.e. '50 instead of '35) with absolute precision, without having to model the entire Universe. Having deduced the correct set of laws, she knew for certain that the state at '50 would include x; let's say atom #349388747374123454984598323423 in that Universe must be a hydrogen atom. She knew this with certainty, because she had deduced the laws governing her Universe and knew that they could not be any other set of laws. The laws were also deterministic. She was also correct in her calculations. Therefore, since she made no mistake in her calculations and had deduced the correct set of laws actually running the Universe, she was correct that atom #349388747374123454984598323423 would be a hydrogen atom. One thing to note, however, is that the universe happened to have an "if year == 23,435,342,449 then all atoms turn into nitrogen atoms wherever they may be" clause, which someone had added to the source code as part of an experiment. Therefore, although she had correctly deduced that atom #349388747374123454984598323423 in the year '50 would be a hydrogen atom, in fact when that year rolled around it was not one. That doesn't stop her deduction from being correct and proper, or mean that she had not deduced the laws with certitude or been correct. In effect, it is possible for me to say with complete and justified certitude that "next year x" as long as I have completely rigorously deduced it, and I will be correct in my rigorous deduction and in thinking that I have a 100% chance of being right, even if next year not x.

You see my struggle?

For me, if you "deduce" that something "must" be exactly some way, you can't possibly have been right in your deduction if it turns out that you are wrong.

Nobody can deduce that Pi cannot equal 4 if, in fact, there is any chance that it is indeed 4 (under that axiomatic system). That's not a deduction: it's a logical fallacy.

So you are using an extremely different definition of "deduction" than I am. I can deduce that within standard axiomatic mathematics, Pi is not equal to 4. After making that correct deduction, there is no conceivable Universe in which I am wrong. (Seriously: it's not possible that in another universe someone exploring ZFC+ would find that Pi is four there.)

Held to such rigor, there is nothing I can deduce with certainty about the laws that govern our Universe, such that it would be impossible for some other laws, or a larger set of laws, to in fact govern it.

However, that is exactly the state that I wanted to assume as axiomatic for my argument. (I wanted you to assume as axiomatic that the complete set of laws or source code could be 'deduced' and as guaranteed to be correct as we have a guarantee that under ZFC+ pi isn't equal to 4.)

If we knew that "these 100 megabytes exactly model our Universe, deterministically; if you run it on enough hardware to simulate a hundred billion galaxies with a hundred billion stars, one of them has a human around year 12 billion", and we were guaranteed that it is exactly the laws of our universe with the same certitude that we are guaranteed Pi isn't equal to 4 under ZFC+, well, that is not the level of certitude that Physics is able to confer :)
