
I second this. To further illustrate that semantics is far more than syntax, and that emotions are an inseparable part of understanding language, try the following experiment: take any word or concept and look up its definition in a dictionary. Take note of the words used in the definition and look up the meaning of those words. Repeat recursively, and if you go far enough down the tree you will find that all words lead to self-recursive definitions stating "it is a feeling of" this or that. This is proof that the semantics of language - and the intelligence required to grasp its semantics - is inseparable from subjective states.
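The "dictionary recursion" experiment above can be sketched as a depth-first walk over definitions. The mini-dictionary below is a made-up toy example (not a real lexicon), used only to show how every definition chain eventually loops back on itself:

```python
# Toy dictionary: each word maps to the words used in its definition.
# The entries are invented for illustration only.
TOY_DICTIONARY = {
    "joy": ["feeling", "of", "pleasure"],
    "pleasure": ["feeling", "of", "enjoyment"],
    "enjoyment": ["state", "of", "joy"],
    "feeling": ["emotional", "state"],
    "emotional": ["relating", "to", "feeling"],
    "state": ["condition"],
    "condition": ["state", "of", "being"],
    "of": [], "to": [], "relating": [], "being": [],
}

def find_cycles(word, path=None, cycles=None):
    """Depth-first walk of definitions, recording every circular chain."""
    if path is None:
        path = []
    if cycles is None:
        cycles = set()
    if word in path:
        # We have revisited a word: record the circular definition chain.
        cycles.add(tuple(path[path.index(word):] + [word]))
        return cycles
    for defining_word in TOY_DICTIONARY.get(word, []):
        find_cycles(defining_word, path + [word], cycles)
    return cycles

for cycle in sorted(find_cycles("joy")):
    print(" -> ".join(cycle))
```

Every path through the toy dictionary bottoms out in a loop, which is the circularity the comment describes; whether that circularity proves anything about subjective states is of course the point under debate.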



What does the experiment prove? Of course dictionaries are circular. But that doesn't mean they don't contain huge amounts of information about the world, and about language. Information an AI could infer without any interaction with the outside world at all.


This means that the semantics of the language is rooted in subjective states. Restated, humans "understand" language because of their emotions. Computers may "understand" language too, but surely not due to subjective states, as it is with humans. If we define AI as a computer that must "understand" language the same way humans do, then by definition, AI is not possible.


This is why Turing invented his famous test. At the time people were arguing about what it would take to prove a machine is intelligent. People argued that the internal properties of the machine mattered, that it needed to feel emotions and qualia just like humans.

But Turing argued that if the machine could be shown to do the same tasks that humans can do, and act indistinguishable from a real human, then surely it's intelligent. Its internal properties don't matter, at least not to judge intelligence.


Exactly. The computer will fail any test that can be passed only if the agent under test experiences subjective states. Since humans can always craft tests of this class, and the computer cannot pass them, it will always fail the "Turing Test".


What test could possibly test for subjective states? You can ask the computer how it feels, and it can just lie, or predict what a human would say if asked the same question. There's no way to know what the computer actually feels, and it doesn't really matter for this purpose.


The easy answer is this: such tests already exist. Since no computer put to the Turing test has passed, simply look up a past test and observe how humans induced the computer to fail.

In practice, a good class of tests is one where an emotional response is required to produce a sensible answer: art interpretation, questions involving allegory, interpreting a poem, and so on.

It is important to note that whatever the challenge is, it must always be a new example - one that has never been seen before. Anything already in the existing corpus, the computer can simply look up. In other words, there is no single concrete test you can reuse again and again.

Example of a test that would foil a computer: a personally written poem, followed by a discussion of it.



