
The Basic Ideas in Neural Networks (1994) [pdf] - sonabinu
http://www-isl.stanford.edu/~widrow/papers/j1994thebasic.pdf
======
kleer001
Whoa, cool, the parent directory is publicly accessible! And it's chockablock
full.

Of course it's indexed here:

[http://www-isl.stanford.edu/~widrow/publications.html](http://www-isl.stanford.edu/~widrow/publications.html)

With gems like "A study of rough amplitude quantization by means of Nyquist
sampling theory" (1956). I bet someone'll get a few days of reading outta this.

~~~
Zorlag
I once saw somebody access a grading sheet from Stanford. The directory was
indexed on Shodan. I wonder what that was all about.

------
tw1010
I wonder why articles from way back when so often seem to have a much higher
average quality than the stuff written today. I'm not sure whether the sole
reason is that only the high-quality stuff survives, or whether it's that
anyone can publish anything on the web, which lowers the bar. It feels like
even the peak of what we find today is rarely as good as the average back
then.

~~~
bluenose69
There is an issue of survivor bias, but people with decades of experience tend
to think that quality has been going downhill. If you're at a university or
institution that provides archives, try scanning journals an issue at a time.

I imagine that fields vary, but in the fields that I've worked in, the quality
has gone down through the decades. Part of this is increasing pressure to
publish, but another factor is the growth of fields. Growth works out very
well if it is natural, with brilliant professors attracting brilliant
students. But problems arise when deans and funding agencies judge professors
by how many people they supervise. The pool of applicants has to be pretty
deep when professors are expected to supervise 3+ students each, and those
students are expected to start doing the same after they graduate.

Many research fields come from nowhere, grow exponentially for a while, and
then either decline or find a non-research way to reach a steady state (e.g.
entertaining undergraduates in service classes). The early stages have high-
quality work, or subsequent stages don't occur. But quality might be expected
to decrease in those subsequent stages, simply because the people doing the
work were not selected as much for their quality and the potential for
success, but rather to keep flow going in an academic pipeline.

It also seems reasonable to put all of this in the larger context of declining
levels of intellectual discourse and literacy through the years.

------
pdpfan
Rumelhart is the co-author of a multi-volume work called "Parallel Distributed
Processing". PDP was the old term in use when Rumelhart and McClelland
repopularized artificial neural network techniques in the 80s. Interestingly
enough, Rumelhart and McClelland were cognitive scientists by training, not
computer scientists.

~~~
f00_
I think the PDP books are super interesting. Have you read Jeffrey Elman's
"Rethinking Innateness"? It's kind of a successor to PDP. Elman did some of
the first work on recurrent neural networks, I think.

Geoff Hinton, some of whose work was in PDP, was an experimental psychologist
by training.

~~~
woodson
Yes, Elman also worked on recurrent neural networks applied to language/text.
A lot of the word2vec stuff doesn't seem too far off when you read his
articles on clustering recurrent (hidden) layer activations.
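For anyone curious what that looks like mechanically, here's a minimal sketch of an Elman-style recurrent step (hypothetical toy code, not from the paper; all names and sizes are made up for illustration). The previous hidden state is fed back as "context units" alongside the current input, and the per-step hidden vectors are the activations one would cluster:

```python
import math
import random

def elman_step(x, h_prev, W_xh, W_hh, b_h):
    """One Elman step: the previous hidden state acts as the
    'context units', combined with the current input x."""
    n_hidden = len(b_h)
    h = []
    for j in range(n_hidden):
        s = b_h[j]
        s += sum(x[i] * W_xh[i][j] for i in range(len(x)))
        s += sum(h_prev[k] * W_hh[k][j] for k in range(n_hidden))
        h.append(math.tanh(s))
    return h

random.seed(0)
n_in, n_hidden = 3, 4  # toy sizes, chosen arbitrarily
W_xh = [[random.gauss(0, 0.1) for _ in range(n_hidden)] for _ in range(n_in)]
W_hh = [[random.gauss(0, 0.1) for _ in range(n_hidden)] for _ in range(n_hidden)]
b_h = [0.0] * n_hidden

# Run a short random input sequence and collect the hidden state at
# each step; clustering these vectors is the analysis Elman did.
seq = [[random.gauss(0, 1) for _ in range(n_in)] for _ in range(5)]
h = [0.0] * n_hidden
states = []
for x in seq:
    h = elman_step(x, h, W_xh, W_hh, b_h)
    states.append(h)
```

(Untrained weights, of course; in Elman's work the network was trained on a prediction task first, and the hidden states then clustered by word category.)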

------
visarga
Shockingly easy to read, explained in plain language. I didn't remember neural
net articles being so accessible, especially old ones.

The hints for successful application are funny to read: each and every word
in there has been the subject of tons of papers.

* spotted Hinton in references, hehe, he was famous when neural nets were a joke

------
david90
This explains the basic ideas really well without getting bogged down in
implementations and frameworks.

------
deepwired
This is nice

