
Does all AI need a learning component? - andrew-lucker
https://medium.com/@andrew_subarctic/does-all-ai-need-a-learning-component-5b2a82410499#.1pqofwrvz
======
mindcrime
The importance of "GOFAI" techniques was one of the themes in my talk at All
Things Open last year. I wanted to cover the "state of the art" in open source
machine learning tools, but realized that that just promoted FDD - Fad Driven
Development, so I spent a modest amount of time during the talk going over
some of the older ideas and pointing out how they still had relevance, and
suggesting that the Next Big Thing might involve some hybridization of GOFAI
techniques and probabilistic / machine learning ideas (including deep
learning). So I guess I'd basically agree with the author here. The only thing
I wish he/she had done is add some links to papers or whatever, highlighting
what they consider to be some of the important new results in AI.

There's definitely still plenty of work ongoing that falls under the "AI"
rubric, see the cs.AI category on Arxiv.org for evidence of that. Of course
some things get cross-posted with the ML category as well, so it's not all
"GOFAI".

[http://arxiv.org/list/cs.AI/recent](http://arxiv.org/list/cs.AI/recent)

~~~
andrew-lucker
Sorry for the lack of substance. I don't like to get too specific on Medium,
but rather just use it to dump general feelings and emotion. That seems to be
the community standard too.

Reposting to HN is different though. What gave me the original feeling of FDD
rather than full-spectrum research is all of the things that don't get
reposted or show up in the news. I am sure that people are still doing that
research, it just doesn't "have a hat on", as one of my professors would say.

~~~
mindcrime
_Sorry for the lack of substance. I don't like to get too specific on Medium,
but rather just use it to dump general feelings and emotion. That seems to be
the community standard too._

No worries, I didn't intend that as a heavy criticism. Just a passing
observation that it would be nice to see some pointers to the things you're
excited about.

~~~
andrew-lucker
My personal passion is linguistics, and I find this whole LSTM or probabilistic
grammar stuff ridiculous. It would be nice to see some research on connecting
knowledge models rather than just trying to mimic speech like a parrot.

