
Ask HN: What is the missing ingredient in machine learning? - hellofunk
It was pointed out recently that as powerful as neural networks are -- they seem capable of learning nearly any type of pattern recognition -- they still require many orders of magnitude more training samples than a real biological mind, which can grasp important patterns and relationships with very little trial and error or observation.

I'd be curious what musings anyone has as to the possible "technical" tool in a human mind that has this remarkable ability. A nice byproduct of AI research is that it makes us more able to respect and marvel at our own minds.

My guess is that our brains do require extraordinary data sets to perform good pattern recognition and learning, but that past generations pass these data sets down in the form of instincts and other genetic information, rather than relying on "runtime" behavior to "train" our "neural nets."
======
stray
Our brains were written in Common Lisp with all types declared and (declaim
(optimize (safety 0) (debug 0) (speed 3))) -- that's obviously a joke, but
maybe there's a kernel of truth in there somewhere?

::shrugs::

Our brains are also constantly making novel micro-theories about the temporal
relationships between things. Aw hell, they're constantly making novel micro-
theories in general.

But mainly, I think, we have a big list of shit we never quite figured out --
and whenever we make some new connection, a background process tries to find a
fit with those unsolved mysteries.

In fact, _my_ brain is constantly trying to work out a cartesian product of
all types of problems with all types of solutions. Even in reverse -- which
has sometimes resulted in spooky insight.
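
That problem-by-solution search could be sketched, very loosely, like this --
the lists of "problems" and "solutions" are purely hypothetical stand-ins:

```python
from itertools import product

# Hypothetical stand-ins for open problems and known solution ideas --
# purely illustrative, not a claim about how brains actually index anything.
problems = ["why jokes land", "how faces are recognized"]
solutions = ["compression", "prediction error"]

# Try every (problem, solution) pairing, and the reverse pairing too,
# mirroring the "even in reverse" part of the search.
pairings = list(product(problems, solutions)) + list(product(solutions, problems))

for a, b in pairings:
    print(f"does '{b}' shed light on '{a}'?")
```

Brute force over every pairing, of course, which is exactly why it only works
as a background process with a small working set.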

Sometimes the results make me laugh at odd times.

Maybe someone needs to find a way to reward AI for finding interpretations or
theories that we would find funny. Maybe it's just as simple as making an AI
that isn't a joyless fuck.

