Show HN: Boltzmann Machines in TensorFlow with Examples: RBM, DBM, AIS (github.com/monsta-hd)
201 points by monsta-hd on Nov 19, 2017 | 11 comments



I remember writing the RBM in straight CUDA back in ~2008. Back then we had to do our tensor work uphill both ways against the gradient.


Why are RBMs so seldom used? What is their weak point? Based on Hinton's talk, I thought they were a major breakthrough back then.


It was, back then... we don't even support them in our framework anymore. Most folks use VAEs and GANs for generative models now; in general, they're just more stable.
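
Part of the instability is that RBM training relies on approximate gradients from Gibbs sampling, e.g. contrastive divergence. A rough NumPy sketch of a single CD-1 update, with all names and sizes made up for illustration:

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_update(v0, W, b_v, b_h, lr=0.01):
        """One contrastive-divergence (CD-1) step for a Bernoulli RBM.
        v0: batch of visible vectors, shape (batch, n_visible)."""
        # Positive phase: hidden probabilities given the data.
        h0_prob = sigmoid(v0 @ W + b_h)
        h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)

        # Negative phase: one Gibbs step back to visibles and hiddens.
        v1_prob = sigmoid(h0 @ W.T + b_v)
        h1_prob = sigmoid(v1_prob @ W + b_h)

        # Approximate gradient of the log-likelihood.
        batch = v0.shape[0]
        W   += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / batch
        b_v += lr * (v0 - v1_prob).mean(axis=0)
        b_h += lr * (h0_prob - h1_prob).mean(axis=0)
        return W, b_v, b_h

    # Toy usage on random binary data.
    n_visible, n_hidden = 6, 3
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)
    data = (rng.random((32, n_visible)) < 0.5).astype(float)
    W, b_v, b_h = cd1_update(data, W, b_v, b_h)

The one-step Gibbs chain gives a biased estimate of the true gradient, and the log-likelihood itself is intractable to monitor, which is a big part of why people find these harder to train than VAEs/GANs.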


How about DBNs? They're also generative models, so why are they rarely used now?

ML researchers follow trends: end-to-end CNNs for whatever problem, this LSTM, that GRU, whatever the latest architecture is; much like web devs picking up and dropping JavaScript frameworks.


A DBN is essentially just stacked RBMs. Most folks just use stacked autoencoders for the exact same reasons you'd use a DBN.
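
"Stacked" here means greedy layer-wise training: fit one layer, freeze it, then fit the next on its codes. A minimal tf.keras sketch of stacking autoencoders that way (layer sizes, epochs, and the random data are placeholders, not a recipe):

    import numpy as np
    import tensorflow as tf

    x = np.random.rand(1000, 784).astype("float32")  # stand-in dataset
    sizes = [784, 256, 64]                           # placeholder layer sizes
    encoders, inputs = [], x

    # Greedy layer-wise pretraining: each autoencoder is trained on the
    # codes produced by the previously trained encoders.
    for n_in, n_hid in zip(sizes[:-1], sizes[1:]):
        enc = tf.keras.layers.Dense(n_hid, activation="sigmoid")
        dec = tf.keras.layers.Dense(n_in, activation="sigmoid")
        ae = tf.keras.Sequential([enc, dec])
        ae.compile(optimizer="adam", loss="mse")
        ae.fit(inputs, inputs, epochs=1, batch_size=64, verbose=0)
        encoders.append(enc)
        inputs = enc(inputs).numpy()  # codes for the next layer

    # The stacked encoder can now be fine-tuned end to end,
    # much like a DBN initialised from stacked RBMs.
    stacked = tf.keras.Sequential(encoders)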

Most "off the shelf" work people do is fairly simple transfer learning on an imagenet cnn.

That or if they do generative, most of the hype is focused on GANs now.


I don't have any commentary here other than to say that this is great work!


Thank you!


I am going to run this on my Chromebook asap!


Oh...wow. This is amazing!


Thank you monsta-hd!


Welcome :)



