
The Strange Loop in Deep Learning - hexrcs
https://medium.com/intuitionmachine/the-strange-loop-in-deep-learning-38aa7caf6d7d
======
_However, as we are all now beginning to discover, the employment of ‘feedback
loops’ are creating one of the most mind-boggling new capabilities for
automation. This is not hyperbole, [...]_

Always glad to see references to Hofstadter's work, but this sounds like
hyperbole. I don't think that mere feedback loops are "strange" enough to
invoke his name. I think the idea refers to loops in systems which are
heterarchical. I don't think there are any interesting crossings over
different layers of abstraction here.

Interestingly, these ideas have been around for quite some time. For example,
the work of Warren McCulloch (and Walter Pitts) predates Hofstadter's thought
in this space. Relevant passage from his 1945 article, "A Heterarchy of Values
Determined by the Topology of Nervous Nets":

_" [..] three heterodromic branches link the dromes so as to form a circle
in the net which is distinguished from an endrome in that it is not the
circuit of any drome but transverse to all dromes, i.e., diadromic. The
simplest surface on which this net maps topologically (without a diallel) is a
tore. Circularities in preference, instead of indicating inconsistencies,
actually demonstrate consistency of a higher order than had been dreamed of in
our philosophy."_

~~~
mlechha
How do you know whether something is hyperbole? When the author claims it is
not hyperbole.

------
j7ake
Quoting Douglas Hofstadter to begin a deep learning article is akin to quoting
Einstein to begin an article on quantum mechanics...

Douglas Hofstadter is against the current direction that artificial
intelligence is headed.

~~~
guscost
> akin to quoting Einstein to begin an article on quantum mechanics

This analogy doesn't work - Einstein _discovered the photoelectric effect_.

~~~
j7ake
Yes, and Hofstadter is one of the pioneers of artificial intelligence, but he
has since been disappointed with where the field has gone.

~~~
guscost
Quantum Mechanics is a well-defined subject. "Artificial Intelligence"
certainly is not. You could say that the builders of the Antikythera Mechanism
were pioneers of artificial intelligence, long before Babbage, Lovelace, or
Turing. You could say that everyone from Grace Hopper to Herbert A. Simon to
Noam Chomsky was a pioneer of artificial intelligence. It doesn't really mean
anything.

If you take "Artificial Intelligence" to mean "Connectionist Modeling in
Silicon" or "Deep Learning" instead, then my point stands.

------
stupidcar
I've always thought that consciousness seems a lot less mysterious if you
consider it as a feedback loop within the neocortex. We know that the brain
takes raw sensory input and turns it, via a neural hierarchy of increasing
abstraction, into a high-level, integrated model. We also know that there is a
lot of additional neural connectivity, both across different sections of the
neocortex, and from higher levels back to lower ones.

If you consider this diffuse connectivity not just as a regulation method, but
as another form of sensory input, then it stands to reason that the brain
would form an additional self-referential hierarchy of abstraction and pattern
recognition based on its own internal state, and that this "sense", a literal
self-awareness, would be very much like what we experience as consciousness.

~~~
goatlover
Self-awareness isn't the fundamental issue with consciousness. It's the
colors, sounds, tastes, feels, etc. Why are those a problem? Because the
physical world isn't colored, doesn't taste like anything, doesn't feel like
anything, etc.

So somehow, a physical system - the brain, creates or is correlated with those
conscious sensations. Explaining this in terms of physics, function,
computation, math, etc is hard because those fields lack any such sensations.
An equation doesn't feel heat or see red, nor do atoms bumping around in the
void. So where do those conscious sensations come from? Do they spontaneously
emerge somehow from the complexity of a nervous system? Does the right sort of
flow of electricity result in the smell of a rose? Is it the right algorithm
that implements consciousness, if you just arrange the bits just right?

~~~
ryukafalz
The best I've been able to come up with is that what we think of as "color" or
"sound" (and similar) are in fact our brain's model of those aspects of the
physical world. Of course you're not experiencing the physical world directly,
but your brain does have a representation of the world based on its sensory
inputs.

------
jagthebeetle
Don't recurrent neural networks (which have been around in several variants
for a few decades) incorporate feedback loops in the sense described in this
article?

I'm curious how a data flow graph could learn representations of itself and of
arbitrary human concepts, but I somehow suspect you'd end up with an
unilluminating generator of JSON strings with many of our current
architectures.
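For what it's worth, the feedback in an RNN is just the hidden state being fed back into the next step. A minimal Elman-style cell in NumPy makes that concrete (all names, sizes, and weights here are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions for a single recurrent cell.
input_size, hidden_size = 3, 4

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden: the feedback loop
b_h = np.zeros(hidden_size)

def step(h, x):
    """One recurrent step: the previous hidden state h feeds back into itself."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

# Unroll over a short sequence: each step consumes the output of the last.
h = np.zeros(hidden_size)
for x in rng.normal(size=(5, input_size)):
    h = step(h, x)

print(h.shape)  # (4,)
```

The loop is only over the network's own output, though, not over its representations of itself, which is the weaker sense of "feedback" the parent comments are debating.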

------
empath75
I think the next revolution in machine learning is when we start connecting
specialized neural networks to each other. A neural network of neural
networks, basically.
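Mechanically, that composition is just routing one network's output into another's input. A throwaway NumPy sketch of the idea, with two hypothetical "specialists" and a combiner (every name and dimension here is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def make_net(in_dim, out_dim):
    """A tiny one-layer 'specialist' network with random placeholder weights."""
    W = rng.normal(scale=0.1, size=(out_dim, in_dim))
    return lambda x: np.tanh(W @ x)

# Two specialists for different modalities, plus a combiner on top of both.
vision = make_net(8, 4)    # e.g. processes image features
audio = make_net(6, 4)     # e.g. processes sound features
combiner = make_net(8, 2)  # consumes the concatenated specialist outputs

def network_of_networks(img, snd):
    return combiner(np.concatenate([vision(img), audio(snd)]))

out = network_of_networks(rng.normal(size=8), rng.normal(size=6))
print(out.shape)  # (2,)
```

The hard part, of course, is training the pieces jointly rather than wiring up frozen ones, which this sketch says nothing about.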

