> The computer is still doing only syntax, even while appearing to do semantics.
Why do you assume people are not doing the same?
> But humans really do semantics. Nobody questions this, or challenges it, because it is self-evident.
It is no more self evident than your claim that computers don't do semantics.
Your argumentation is circular:
IF computers can't do semantics, and IF humans do, then clearly humans and computers must be fundamentally different. But your claim that computers can't do semantics (as opposed to the claim that we don't yet know how to program computers to do semantics) already assumes that humans and computers are fundamentally different. If we are not, and the materialist hypothesis holds, there is simply no basis for assuming a fundamental difference (as categories; clearly current computers are not structured like human brains).
> It is easier to be an idiot, because that doesn't put your funding in jeopardy.
Less name calling, and more of an argument that doesn't rest on a logical flaw, might be preferable.
> You apparently don't understand how computers work.
You appear to like jumping to conclusions.
> There is no "meaning" associated with any variable or its value, and there cannot be.
Please give me a definition of "meaning" that can be applied to a human brain and not to a computer. And what is the evidence that it cannot be applied to a computer?
You are also jumping to the conclusion that, if, as you claim, the brain depends on something that can't be explained under the materialist hypothesis, then a computer cannot do semantics either; yet you have not even presented an argument for why that would be so.
> Please give me a definition of "meaning" that can be applied to a human brain and not to a computer.
You associate a meaning with the symbol "dog". You think of an animal that barks, wags its tail, chases cats and squirrels, and is happy to see you when you get home. You associate something in the real world with that symbol. (That is the very meaning of doing semantics.)
The computer does nothing of the sort with the symbol "dog".
Please provide a definition of "associate a meaning with a symbol" that can be applied to a human brain and not to a computer.
The plain reading of what you've written above is so trivially simple to implement in a computer that it is covered in every introductory algorithms course, so I presume you have either given a definition of "meaning" that is overly simplistic, or have a definition of "associate a meaning" that is substantially more complicated than a plain reading of the words.
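To make the "plain reading" concrete: here is a minimal sketch of associating the described properties with the symbol "dog" in a computer. All the names and attributes are illustrative; the point is only that the data structure is elementary, not that this settles whether the computer thereby "understands" anything.

```python
# Associate real-world properties with the symbol "dog".
# The specific attributes are taken from the description in the post above;
# whether storing them constitutes "meaning" is exactly what is in dispute.
associations = {
    "dog": {
        "category": "animal",
        "sounds": ["bark"],
        "behaviors": ["wags tail", "chases cats", "chases squirrels",
                      "is happy to see you when you get home"],
    }
}

def describe(symbol):
    """Return the stored associations for a symbol, or an empty dict."""
    return associations.get(symbol, {})

print(describe("dog")["category"])  # -> animal
```

Whether this dictionary lookup counts as "doing semantics", or is merely syntax shuffling that mimics it, is of course the very question under debate.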
What is this thing we call "meaning"? Can it be a point in a vast and complex "pattern space" and the meaning of that point is the structure of all the "paths" that map that point to other points in this same pattern space?
Our act of arbitrarily categorizing patterns into "syntax" and "semantics" just seems to obscure the fact that they are two arbitrary encodings of patterns.
How can we know that one pattern equals another pattern in a different encoding? Don't we agree on these mappings with other pattern spaces (read: humans)? As we communicate, our pattern spaces start interacting, and all we can hope for is convergence on the mappings sans encoding.
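The "pattern space" idea above can be sketched as a directed graph: nodes are patterns, edges are the mappings between them, and the "meaning" of a node is nothing stored in the node itself but the structure of paths radiating out from it. All node names here are illustrative assumptions, not part of the original argument.

```python
from collections import defaultdict

# A tiny hypothetical pattern space: each edge maps one pattern to another.
edges = defaultdict(list)
for a, b in [("dog", "animal"), ("dog", "barks"), ("animal", "living"),
             ("barks", "sound"), ("cat", "animal")]:
    edges[a].append(b)

def paths_from(node, depth=2):
    """Enumerate all outgoing paths from `node`, up to a fixed depth.

    On the view sketched above, this path structure *is* the node's meaning.
    """
    if depth == 0 or not edges[node]:
        return [[node]]
    return [[node] + rest
            for nxt in edges[node]
            for rest in paths_from(nxt, depth - 1)]

for p in paths_from("dog"):
    print(" -> ".join(p))
```

Two agents with differently encoded spaces could then compare not node labels but path structures, which is one way to read "convergence on the mappings sans encoding".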
> Can it be a point in a vast and complex "pattern space" and the meaning of that point is the structure of all the "paths" that map that point to other points in this same pattern space?
That's the exact point of contention: whether semantics can be represented by syntax is currently unknown, though it must hold in a materialist world. If it can't, as some believe, then the categorization isn't arbitrary.
Isn't the only issue here that we have one set of spaces (brains) with that structural understanding (the link/edge), and another set without it?
It seems that the brains that possess the link are busy implementing its isomorphic structure in technology, while the brains that do not possess the link contribute nothing, as they are still in a more "primitive" state. (Primitive meaning that they lack the linkage to see that the terms are really structurally isomorphic in the grand scale of things.)
It would be interesting to know what input those brains that do not possess the link require to start possessing it.
Can there exist brains that will never make the link?
Until there's evidence to the contrary, the materialist world is the only world there is; you can simply call it "the world", since calling it the materialist world is redundant.
Materialism has ample evidence to support it; dualism has no good evidence, it is therefore dualism that is on trial, not materialism.