
Recurrent neural networks, dreams, and filling in - pakl
http://blog.piekniewski.info/2016/12/05/recurrent-dreams-and-filling-in/
======
felippee
Just to note: although the models discussed in this blog can be arbitrarily deep
and rely on multilayer perceptrons, the principles of learning are completely
different from those in other contemporary deep learning: the error is injected at
every level in a distributed way. There is no vanishing gradient, because
there is no need to propagate anything end to end.
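A minimal sketch of what "error injected at every level" could look like, as I understand it (this is my own toy illustration, not the actual PVM code): each layer predicts its own next input and is updated only from its own local error, so no gradient ever travels through the full stack.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer_step(W, x, target, lr=0.01):
    """One local update: predict from x, measure the local error, adjust
    only this layer's weights. Nothing is backpropagated to other layers."""
    pred = np.tanh(W @ x)
    err = target - pred                        # error injected at this layer
    # local delta rule; (1 - pred**2) is tanh's derivative
    W += lr * np.outer(err * (1 - pred**2), x)
    return pred

# Three stacked layers; layer i tries to predict its current input from
# the input it saw one step earlier (one-step-ahead prediction).
sizes = [8, 8, 8]
Ws = [rng.normal(0, 0.1, (n, n)) for n in sizes]

x_prev = [rng.normal(size=n) for n in sizes]
for t in range(100):
    signal = rng.normal(size=sizes[0])         # new bottom-level input
    inputs = []
    for i, W in enumerate(Ws):
        inputs.append(signal)                  # remember this layer's input
        # target is the *current* input; prediction is made from the
        # *previous* input, so the error is purely local and temporal
        signal = layer_step(W, x_prev[i], signal)
    x_prev = inputs
```

Because each update depends only on a layer's own prediction error, the depth of the stack is irrelevant to training stability, which is the point being made above.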

------
socmag
Very interesting work.

So I was wondering, as far as "fill-in" goes:

I see this is mainly focused on image synthesis for information that literally
isn't available.

OTOH, in the real world occluders often appear only temporarily, as things move
around in front of us or we move past an occluder.

In those cases we may have seen what is actually there just moments before the
occluder appeared.

So it might be interesting to create a version that fills in from what it had
just seen, rather than from a corpus.

That way, if I'm shooting a video and an annoying lamp post suddenly gets in
the way, it can be instantly erased. This would seem really useful for
autonomous vehicle work as well.
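The crudest version of that idea (my own toy sketch, nothing to do with PVM's actual mechanism) is just a per-pixel memory: keep the most recently seen unoccluded value at each location, and paint it back wherever the current frame is occluded.

```python
import numpy as np

def temporal_fill(frame, mask, memory):
    """Fill occluded pixels (mask==True) from the last values seen there.

    `memory` holds the most recent unoccluded value per pixel and is
    updated in place from the visible part of the current frame.
    """
    memory[~mask] = frame[~mask]       # remember what is currently visible
    filled = frame.copy()
    filled[mask] = memory[mask]        # paint over the occluder
    return filled

h, w = 4, 6
memory = np.zeros((h, w))
clean = np.arange(h * w, dtype=float).reshape(h, w)

# Frame 1: fully visible, so memory learns the whole scene.
out1 = temporal_fill(clean, np.zeros((h, w), bool), memory)

# Frame 2: a "lamp post" occludes column 2; fill-in restores it from memory.
occluded = clean.copy()
occluded[:, 2] = -1.0
mask = np.zeros((h, w), bool)
mask[:, 2] = True
out2 = temporal_fill(occluded, mask, memory)
```

A learned model would of course have to handle camera and object motion between frames, which this static-pixel memory ignores entirely.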

I liked the ideas on sleep at the end as well, w.r.t. attractors. I've thought
quite a bit about "stuck target vectors" in the context of boid simulations.
You really might be onto something there. We could have a long chat about that.

Anyway, it's really cool, and great blog!

~~~
felippee
To some extent it does fill in based on what it has seen before. Although
there is a fair number of parameters in that model, it cannot memorise the
entire clip. This is apparent because the "dream" sequences cannot reproduce
the whole sequence: they compress, stretch, or repeat certain subsequences, or
collapse into a fixed point.
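The "collapse into a fixed point" behaviour is easy to see in any closed-loop recurrent map (a generic toy example here, not the model from the post): feed the network its own output and the state can stop changing.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random recurrent weights, rescaled so the map x -> tanh(W @ x) is a
# contraction (operator norm 0.5), guaranteeing convergence to a fixed point.
W = rng.normal(size=(16, 16))
W *= 0.5 / np.linalg.norm(W, 2)

x = rng.normal(size=16)
for t in range(200):
    x_next = np.tanh(W @ x)            # closed-loop "dream": output -> input
    if np.linalg.norm(x_next - x) < 1e-9:
        break                          # state has settled on an attractor
    x = x_next
```

With weights this contractive the dream dies into a single point; a trained model sits closer to the edge, which is presumably why it sometimes loops subsequences instead.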

So the fill-in does have some temporal context of what was there recently. I'm
running a bigger model now (a great thing about PVM is that it scales
seamlessly), which should produce a better-quality image.

~~~
socmag
Great! Looking forward to the results. Good stuff.

