Hacker News
Cellular Automata as Convolutional Neural Networks (arxiv.org)
87 points by benibraz 47 days ago | 14 comments



See also: "learning" Conway's Game of Life configurations by gradient descent.

https://hardmath123.github.io/conways-gradient.html
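To see why the Game of Life is "convolutional" at all: each update is a neighbour sum (a 3x3 kernel of ones with a zero centre) followed by a pointwise rule. A minimal pure-Python sketch, assuming a wrap-around grid:

```python
def life_step(grid):
    """One Game of Life step: a 3x3 'convolution' (neighbour sum)
    followed by a pointwise threshold rule."""
    h, w = len(grid), len(grid[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Neighbour count = convolving with a 3x3 kernel of ones
            # (zero centre), with wrap-around boundaries.
            n = sum(grid[(y + dy) % h][(x + dx) % w]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
            # Birth on exactly 3 neighbours; survival on 2 or 3.
            out[y][x] = 1 if n == 3 or (grid[y][x] and n == 2) else 0
    return out

# A "blinker": a vertical bar of three cells flips to horizontal.
g = [[0] * 5 for _ in range(5)]
for y in (1, 2, 3):
    g[y][2] = 1
g2 = life_step(g)
```

The gradient-descent trick in the linked post works on a smoothed relaxation of this same conv-plus-threshold structure.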


I wonder if it's because, with backpropagation, all the non-linear functions are chained together when the weights are learnt? Is it naive to think that, given that formulation, the results will be quite similar, since the final model equations are close?


Excellent write-up. I've coincidentally been experimenting with the same thing. Any idea whether this approach could be used to speed up a search for exact solutions?



Not sure I understand the importance of this...


Indeed, a CA can be represented by simple combinations of Boolean functions, and so obviously by a NN as well, which is itself a combination of similar nonlinear functions.


A wide enough NN can represent any arbitrary binary function, but it's not obvious that one can learn it.
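Representing is easy to show by construction; learning is the hard part. For instance, XOR (which no single neuron can compute) falls out of one hidden layer of two threshold units with hand-picked, not learned, weights:

```python
def step(x):
    # Heaviside threshold unit.
    return 1 if x > 0 else 0

def xor_net(a, b):
    """XOR via one hidden layer of two threshold neurons:
    h1 fires on (a OR b), h2 fires on (a AND b);
    the output fires on (h1 AND NOT h2).
    Weights are hand-picked for illustration, not learned."""
    h1 = step(a + b - 0.5)      # OR gate
    h2 = step(a + b - 1.5)      # AND gate
    return step(h1 - h2 - 0.5)  # OR and not AND
```

Whether gradient descent actually finds such weights for an arbitrary Boolean function is a separate question.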


Yet the best NNs are deep, not wide.


What do you mean by 'the best'? Deeper architectures are popular because they are quite easy to train. They do work well in practice on many tasks (especially vision), but they have their limits.

Infinitely wide networks are a newly active area and have recently shown some promising results, theoretically [1, 2] and empirically [3].

[1] https://arxiv.org/abs/2001.06931 [2] https://arxiv.org/abs/1806.07572 [3] https://ai.googleblog.com/2020/03/fast-and-easy-infinitely-w...


> We show that any CA may readily be represented using a convolutional neural network with a network-in-network architecture.
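The construction is straightforward to sketch: a convolution with powers-of-two weights encodes each 3x3 neighbourhood of a binary CA as an integer 0..511, and a pointwise lookup then applies an arbitrary rule table. Here the lookup table stands in for the 1x1 network-in-network layers (the paper realises this stage with small per-pixel networks; the table is an assumption for illustration):

```python
def ca_step(grid, rule):
    """Any binary 2D CA as 'convolution + pointwise network':
    encode the 3x3 neighbourhood as an integer (what a width-3x3
    convolution with weights 1, 2, 4, ..., 256 computes), then
    apply `rule`, a 512-entry lookup table, pointwise."""
    h, w = len(grid), len(grid[0])
    out = [[0] * w for _ in range(h)]
    offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    for y in range(h):
        for x in range(w):
            code = 0
            for i, (dy, dx) in enumerate(offsets):
                code |= grid[(y + dy) % h][(x + dx) % w] << i
            out[y][x] = rule[code]
    return out

# Example rule table: the Game of Life over the 9 neighbourhood bits
# (bit 4 is the centre cell).
life = []
for code in range(512):
    bits = [(code >> i) & 1 for i in range(9)]
    centre, n = bits[4], sum(bits) - bits[4]
    life.append(1 if n == 3 or (centre and n == 2) else 0)

g = [[0] * 5 for _ in range(5)]
for y in (1, 2, 3):
    g[y][2] = 1
g2 = ca_step(g, life)  # blinker: vertical bar becomes horizontal
```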


I take it this is not a bidirectional mapping?


I was hoping to see some mention of rule 110
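Rule 110 (like any elementary 1D CA) fits the same conv-plus-lookup mould: a width-3 convolution with weights (4, 2, 1) encodes the neighbourhood as an integer 0..7, and the bits of the rule number act as the pointwise lookup. A sketch, assuming a wrap-around row:

```python
def eca_step(row, rule_number):
    """One step of an elementary 1D CA with wrap-around boundaries.
    Each 3-cell neighbourhood is encoded as an integer 0..7 --
    exactly what a width-3 convolution with weights (4, 2, 1)
    computes -- and bit k of rule_number gives the output for
    neighbourhood value k."""
    n = len(row)
    return [(rule_number >> (row[(i - 1) % n] * 4
                             + row[i] * 2
                             + row[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 7 + [1]          # a single live cell
nxt = eca_step(row, 110)     # the cell spreads one step leftward
```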


kind of obvious


Can you please tell us how it is obvious? I am interested in the network-in-network part.



