
> All of the information of our world is contained in text.

This statement is false. There is a well-known thought experiment called Mary's Room, the gist of which is that knowing all conceivable scientific knowledge about how humans perceive color is still not a substitute for being a human and perceiving the color red: https://philosophynow.org/issues/99/What_Did_Mary_Know

The experience of seeing red is an example of what is called “qualia”.

With Google AI systems that identify cats, birds, etc., it is reasonable to imagine the technology evolving towards systems that can discuss those objects at the level of a typical person. With an AI based on text only, however, there is no possibility of that. It would be like discussing color with a blind person or sound with a deaf person.




Mary's Room and qualia are totally irrelevant. I'm not asking whether the computer will "feel" "redness", simply whether it can pretend to do so through text: whether it can talk about the color red in a way indistinguishable from any other human talking about red.

In any case, at some level everything is symbols. A video is just a bunch of 1's and 0's, as is text, and everything else. A being raised on only text input would have qualia just like a being raised on video input. It would just be different qualia.


As explained elsewhere in this thread, it may be that our brains share a genetically coded "decryption key", and that many of the things we talk about are expressed too poorly and noisily for a purely text-based computer AI to ever truly replicate the processes going on inside our brains. Insufficient data, simply put, and no way to get it.

It may sure look indistinguishable, but on the inside it just wouldn't be the same.


The Mary's Room thought experiment is garbage, if you ask me. You can't just assume your hypothesis and then call the result truth.

If you assert that a person can understand everything there is to know about the color red and then still not understand what it is like to see red, you have either contradicted yourself or assumed dualism.


They assert that Mary understands the physical phenomenon of red. That is, she understands photons and eye structure, and therefore knows that light of a particular wavelength will trigger certain sensors in the eye and thereafter be interpreted as "red" by a brain. She knows all the physical components necessary to produce and sense the color "red". But when Mary sees the apple for the first time, does she learn something more about "red"?

Also, it's a thought experiment. Some people will claim the answer to that question is no, she learned nothing. Others will claim that she did. That thing she learned beyond the physical is what theoretically cannot be conveyed by science, or possibly even by language.


To further this comment, also see Tacit Knowledge, the very definition of which is essentially knowledge that cannot be transferred, or is extremely difficult to transfer, through words alone.

https://en.wikipedia.org/wiki/Tacit_knowledge



