Is AI just search through a powerset of language? - headshock1111
======
yesAndKnow
Sort of.

If one defines language as simply a set of symbols for things, then yes, A.I. is
reducing the search space. But that's not exactly what happens in the mind.
The mind isn't actually making the search space smaller; it's remembering which
searches are or are not effective. If it needs new knowledge, it remembers it,
provided it has the wetware to do so.

But here is the thing: when I read Socrates, I can connect with the knowledge
that Plato transcribed in a way that is different than what A.I. does because
I am human. If an A.I. is to truly be considered to have 'human' intelligence,
it must be quasi-human. So I think it's important to discuss A.I. in terms
that are closer to something that augments human intelligence rather than
something that is independently intelligent. It's a little naive to think of
it as a language search, because then you might say that Google's search
algorithm is the most intelligent public A.I. agent on Earth. But Watson is
way smarter than Google's algorithm, and I'd wager there are even smarter
agents at work that we barely understand.

Likewise, language typically describes non-science-related events, such as the
President's farewell address. But even though a linguist could rightly argue
that such words are chosen with a certain scientific specificity, language is
scientifically driven in many instances. So good A.I., were it to be as you
posit, would reduce, say, the molecular activity of a couple of trillion cells
to a single act, like walking, riding a bike, dialing a phone, or hammering a
nail. All of these things are easily understood by the human mind, but are not
so easy for A.I. because of the (importantly) complex activity involved in
virtually everything a human does. In fact, I cannot think of anything a human
does that isn't extremely complex when put in terms an inorganic agent could
fully comprehend.

But I think you're fishing a little, and that's cool, because everyone wants
to know what A.I. might or might not be, and we all have to learn as we go.

------
BjoernKW
From a human perspective, I suppose you could say that. Keep in mind, though,
that this still means an infinite search space, given that (human) language is
either context-free or context-sensitive and therefore capable of generating
an infinite number of valid statements.
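To make that concrete: even a toy context-free grammar with a handful of rules derives an unbounded number of distinct valid sentences, because a production can recurse (here, a noun phrase containing a verb phrase containing a noun phrase...). A minimal sketch in Python — the grammar is a hypothetical fragment, nothing like real English:

```python
# Toy context-free grammar (hypothetical illustration).
# "NP" is recursive via its relative-clause production, so the set of
# derivable sentences is infinite; only a depth budget keeps it finite here.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],  # recursive rule
    "VP": [["V", "NP"], ["V"]],
    "N":  [["dog"], ["cat"]],
    "V":  [["saw"], ["chased"]],
}

def expand(symbols, depth):
    """Yield every word sequence derivable from `symbols`
    using at most `depth` rule expansions in total."""
    if not symbols:
        yield []
        return
    head, rest = symbols[0], symbols[1:]
    if head not in GRAMMAR:                       # terminal word
        for tail in expand(rest, depth):
            yield [head] + tail
    elif depth > 0:                               # nonterminal: try each production
        for production in GRAMMAR[head]:
            yield from expand(production + rest, depth - 1)

# The number of distinct sentences keeps growing as the depth budget grows.
counts = [len(set(map(tuple, expand(["S"], d)))) for d in (5, 7, 9)]
print(counts)
```

Each extra level of recursion admits longer sentences ("the dog that chased the cat saw ..."), so no finite enumeration ever covers the language — which is why "searching language" is not searching a fixed set.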

If you approach intelligence from a more general, not-necessarily-human-like
level, you'd have to accommodate a much larger set of elements to search
through. Think about those recent examples of AI Go players whose tactics
often don't seem to make sense to human players.

AGI thinking might indeed appear very alien to us.

