
An Exact Mapping Between the Variational Renormalization Group and Deep Learning - evanb
http://arxiv.org/abs/1410.3831
======
jamessb
There was some discussion of a blog post explaining this paper here a month
ago:
[https://news.ycombinator.com/item?id=9833584](https://news.ycombinator.com/item?id=9833584)

------
ssivark
I have a question for OP @evanb, or anyone else who's knowledgeable on this:
Are ideas from tensor networks (e.g., how the relevant degrees of freedom are
organized), or even neural networks for that matter, applied to lattice gauge
theories? In the other direction, is it understood how to implement states of
gauge theories as tensor networks?

I'm aware of Elitzur's theorem (that there is no local gauge invariant order
parameter), but I'm afraid I do not know much about lattice gauge theories.

~~~
evanb
I don't know much about tensor networks, so I may not be the right person to
answer your question. Maybe that's answer enough? I would guess that it would
only be apparent how to do some kind of embedding like you're thinking about
after fixing a gauge, so that there would be gauge configurations that are
physically equivalent but not at all manifestly related to the network you're
interested in.

I do know that people think about tensor networks seriously as toy models for
quantum gravity. But that's quite outside my range of expertise.

------
sieisteinmodel
How do you establish an exact mapping to something that is not exact, but only
a buzzword term under which different people collect different methods?

Well, not that important. It's only important that the buzzword appears in the
title!

~~~
dang
The problem with this comment is that it doesn't teach us anything. If the
article is wrong, it would be valuable to explain _how_ it is wrong in a way
that readers here can understand. But a sarcastic dismissal that leaves out
the substance merely adds negativity.

~~~
JadeNB
I think that, if a phrase is meaningless, it is enough to explain why it is
meaningless (because it proposes to make an 'exact' connexion between
something rigorously defined and something that is not) without having to
explain in what way the argument for it fails to fulfil that meaningless goal.

Nonetheless, it's surely also true that it would be nice to suggest a
constructive remedy; and, fortunately, one need not go farther than the
abstract to find (a better approximation to) the precise statement that the
authors are making:

> We construct an exact mapping from the variational renormalization group,
> first introduced by Kadanoff, [to] deep learning architectures based on
> Restricted Boltzmann Machines (RBMs).
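For readers unfamiliar with RBMs: the mapping in the abstract identifies the RG coarse-graining variables with the hidden layer of a Restricted Boltzmann Machine. A minimal sketch of what an RBM actually is, i.e. its conditional distributions and one block-Gibbs sampling sweep (the sizes, weights, and variable names here are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy RBM: 6 binary visible units ("spins"), 3 binary hidden units.
# Weights and sizes are arbitrary illustrative choices.
n_vis, n_hid = 6, 3
W = rng.normal(scale=0.1, size=(n_vis, n_hid))  # visible-hidden couplings
b = np.zeros(n_vis)                             # visible biases
c = np.zeros(n_hid)                             # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def p_hidden_given_visible(v):
    # p(h_j = 1 | v): hidden units are conditionally independent given v
    return sigmoid(c + v @ W)

def p_visible_given_hidden(h):
    # p(v_i = 1 | h): visible units are conditionally independent given h
    return sigmoid(b + W @ h)

def gibbs_step(v):
    # One block-Gibbs sweep: sample h from v, then a new v from h
    h = (rng.random(n_hid) < p_hidden_given_visible(v)).astype(float)
    v_new = (rng.random(n_vis) < p_visible_given_hidden(h)).astype(float)
    return v_new, h

v0 = rng.integers(0, 2, size=n_vis).astype(float)
v1, h1 = gibbs_step(v0)
print(v1.shape, h1.shape)
```

The bipartite structure (no couplings within a layer) is what makes both conditionals factorize, and it is this layered summing-out of hidden variables that the paper relates to an RG coarse-graining step.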

~~~
losername
That was a case of not reading the link rather than a genuine assertion of
meaninglessness.
The abstract makes clear that deep learning as an umbrella term would even fit
RG, a completely unrelated concept from physics.

