
What tech calls “AI” isn’t really AI - raleighm
https://www.salon.com/2018/04/29/what-tech-calls-ai-isnt-really-ai/
======
jdietrich
> The AI industry has a terrible problem: once we create an algorithm for
> something it's no longer AI, it's just an algorithm.

[https://news.ycombinator.com/item?id=14178405](https://news.ycombinator.com/item?id=14178405)

~~~
eesmith
I'm not sure that's true. I remember Alex Martelli (famous in the Python
world) writing about how people now consider methods based on Bayes'
Theorem (i.e., statistical learning) to be part of AI. His view in 2003 was:

> considering Bayes' theorem to be part of AI makes just about as much sense
> as considering addition in the same light, if "expert systems" had been the
> first context in which you had ever seen numbers being summed. In the '80s,
> when at IBM Research we developed the first large-vocabulary real-time
> dictation taking systems, I remember continuous attacks coming from the
> Artificial Intelligentsia due to the fact that we were using NO "AI"
> techniques -- rather, stuff named after Bayes, Markov and Viterbi, all dead
> white mathematicians (it sure didn't help that our languages were PL/I,
> Rexx, Fortran, and the like -- no, particularly, that our system _worked_,
> the most unforgivable of sins:-). I recall T-shirts boldly emblazoned with
> "P(A|B) = P(B|A) P(A) / P(B)" worn at computational linguistics conferences
> as a deliberately inflammatory gesture, too:-).

So an algorithm that the AI practitioners of the time did not consider part
of AI, because it didn't aim for semantic understanding, has since become
part of an expanded definition of AI.
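The formula on those T-shirts really is just arithmetic. A minimal sketch, with made-up numbers for a hypothetical spam-filter scenario (the events and probabilities are purely illustrative, not from any real system):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
# A = "message is spam", B = "message contains the word 'free'"
# All numbers below are invented for illustration.

p_a = 0.4          # prior: 40% of messages are spam
p_b_given_a = 0.6  # 60% of spam messages contain "free"
p_b = 0.3          # 30% of all messages contain "free"

# Posterior: probability the message is spam, given it contains "free"
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # -> 0.8
```

Whether you file that under "AI" or "addition" is, as Martelli suggests, mostly a matter of framing.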

~~~
FiatLuxDave
Since when is Viterbi dead?

~~~
eesmith
Indeed! Looks like Martelli made a mistake in that quote.

------
dvfjsdhgfv
This article is seriously undervoted. I know it's against the interest of
several stakeholders here, but admitting that what we're working on now has
little to do with AI could actually help us direct our efforts towards true
AI, if it's possible to reach. Instead, we call simple classification "AI",
even though it has much more to do with statistics. Let's just leave it at
"machine learning" and stop talking about AI as something that already
exists, especially in products.

------
senectus1
This is what happens when marketing gets hold of buzzwords.

Same problem with the term "crypto", which now apparently means anything from
encryption to digital money.

It's super annoying.

------
walterbell
Like many words, AI can be redefined by economic winners, including
advertisers and GPU vendors promoting autonomous driving.

------
detaro
... for some definition of "AI" the author decided is the true one.

~~~
throwaway84742
As a practitioner in this field, I kind of agree with the author. Historically
“intelligence” was synonymous with “cognition” as a necessary ingredient for
the dictionary definition of the word: “the ability to acquire and apply
knowledge and skills”. To call the ML techniques we have right now
“intelligence” requires one to stretch the truth pretty significantly.

~~~
hshehehjdjdjd
Historically, all kinds of words meant all kinds of different things than what
they do now. A word means what people think it means. It is a symbol used to
communicate an idea. We can wish that prescriptivist interpretations applied,
but one thing we can’t seem to do is make that the reality.

~~~
oddlyaromatic
Normally I'd agree with this sentiment: language changes constantly, usage is
king. However we're in an era where people (often in the course of marketing
themselves or a product) intentionally move the goalposts of word meanings in
a way that's generally unhelpful to users of that language. So we describe
there is something called X that sounds very impressive and powerful. And
there is something else _kind of like_ the existing idea we called X, but it's
more mundane ... it's close enough that salespeople can get away with calling
it that, so they do, and the definition of X is now hopelessly muddled. I
don't know what to do about it, but this does not strike me as being the same
as the natural evolution of word meanings over time. I guess really it is
still one form of that. But maybe it's one that as a society we should be more
careful of - you can't, for example, call one medication by the name of
another for marketing reasons.

Maybe AI is not the worst example. But most buzzwords that are not legally
protected just get marketed into meaninglessness in a way that harms
non-expert consumers.

