
Is artificial intelligence going off the rails? - runesoerensen
https://theoutline.com/post/2373/is-artificial-intelligence-going-off-the-rails
======
AndrewKemendo
_> Everybody knows that my brain does not operate by having trillions of
examples._

This is where I disagree fundamentally with others in the field, though
"trillions" is also an exaggeration.

In my opinion this needs a lot more study, but if you look at just the amount
of visual data we process per hour, it's staggering.

There is no coherent comparison between the human visual system and digital
systems in terms of image processing, but we can use some rough equivalents as
a starting point.

MIT researchers have shown that humans can process images with an exposure
time of about 13ms [1]. Granted, that is "recognition" of something that has
already been seen, but it does suggest a lower bound on the time the
optical/visual system needs to discriminate visual features and "close the
loop."

If you treat 13ms per image as a rough "sampling rate" (in the same ballpark
as the ~60Hz at which most people start to see screen flicker), that works out
to roughly 77 "images" per second, or around 276,000 per hour. Extrapolated
over waking hours, you are looking at on the order of 1.4B "samples" per year.
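
A quick sanity check of that arithmetic, as a rough Python sketch (the 13ms
figure is from the MIT study; the ~14 waking hours per day is my own
assumption to land near 1.4B):

    # Back-of-the-envelope only: 13ms per "image" from the MIT result,
    # ~14 waking hours/day is my assumption, not a measured figure.
    exposure_s = 0.013
    images_per_second = 1 / exposure_s             # ~77
    images_per_hour = images_per_second * 3600     # ~276,900
    images_per_year = images_per_hour * 14 * 365   # ~1.4B

    print(f"{images_per_hour:,.0f} per hour")   # 276,923
    print(f"{images_per_year:,.0f} per year")   # 1,415,076,923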

I would imagine that a system with millions of years of evolution behind it
can do a lot to make that visual pipeline REALLY efficient, even if you keep
the same basic learning structure. That's especially true in comparison to our
crude tools: some of our best DL training sets only have 60-80,000 images.
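
Side by side, the gap is enormous; taking 70,000 images as a representative
training-set size (my own pick from the 60-80k range):

    # Ratio of the rough yearly visual "sample" count to a typical DL training set.
    yearly_samples = 1.4e9
    dataset_size = 70_000   # representative of the 60-80k range above
    print(f"~{yearly_samples / dataset_size:,.0f}x more samples")   # ~20,000x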

Again, these comparisons are not rigorous, but they illustrate my point that
it's much more data than I think Hinton et al. give credit for. A lot more
needs to be taken into consideration for this to be studied properly: attention
within the visual field, how much cognitive power goes to peripheral vs. focal
vision, etc.

[1] http://news.mit.edu/2014/in-the-blink-of-an-eye-0116

