
On the Expressive Power of Deep Learning: A Tensor Analysis - groar
http://arxiv.org/abs/1509.05009
======
news_1776
tldr, anyone? I'm interested in the field and have been learning about it, but
I don't really understand this paper.

~~~
hpenedones
The tldr is in the abstract: "In deep learning terminology, this amounts to
saying that besides a negligible set, all functions that can be implemented by
a deep network of polynomial size, require an exponential size if one wishes
to implement (or approximate) them with a shallow network."

~~~
tgflynn
That's a very interesting result (assuming it's correct).

It certainly agrees with intuition from the analogy to boolean circuits, where, for example, the parity function requires exponential circuit size for shallow (constant-depth) circuits but only linear size for deeper circuits. I hadn't seen a proof of an analogous separation for NNs before, though.
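To make the parity analogy concrete: a balanced tree of 2-input XOR gates computes n-bit parity with n-1 gates at depth ~log2(n), while a depth-2 OR-of-ANDs (DNF) circuit needs one minterm per odd-parity input, i.e. 2^(n-1) terms. This little sketch just counts both sizes by brute force; it's an illustration of the gap, not a proof, and the function names are made up here.

```python
from itertools import product
from math import ceil, log2

def parity(bits):
    """XOR of all bits."""
    acc = 0
    for b in bits:
        acc ^= b
    return acc

def xor_tree_size(n):
    # Balanced binary tree of 2-input XOR gates:
    # n-1 gates, depth ceil(log2(n)) -- linear size, logarithmic depth.
    return n - 1, ceil(log2(n))

def dnf_term_count(n):
    # Depth-2 circuit (OR of AND-minterms): for parity, every term must
    # fix all n variables (dropping one variable would cover inputs of
    # both parities), so each satisfying input needs its own minterm.
    return sum(1 for bits in product([0, 1], repeat=n) if parity(bits) == 1)

for n in [4, 8, 12]:
    gates, depth = xor_tree_size(n)
    print(f"n={n}: deep XOR tree = {gates} gates (depth {depth}), "
          f"shallow DNF = {dnf_term_count(n)} terms")
```

The deep circuit's size grows linearly in n while the shallow one's grows as 2^(n-1), which is the same flavor of depth/size trade-off the paper establishes for networks.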

