
It is a little old given how quickly the field moves, but 'Information Theory, Inference and Learning Algorithms' has a chapter on neural networks. It is an outstanding book: a labour of love from a very smart person. The exercises are varied, the explanations are great, there's a sprinkling of humour, and connections are drawn between multiple fields of study. Moreover, it is freely available from the author's website: http://www.inference.org.uk/itprnn/book.html

Have you considered giving Goodfellow another shot, but trying to re-derive the results therein as a form of exercise? I think that would likely be one of the faster methods to bring yourself reasonably up to date with the field.

I second this as well! ITILA is not just outstanding but offers a timeless perspective on statistical and uncertainty modelling, not to mention information theory, which is often understudied in this field (including by myself).

As the parent poster says, this field moves fast, but this book will give you a solid grounding.

Even though the treatment of neural networks is short, the beginning chapters are worthwhile. The chapter on random variables and probability is one of the best introductions to probabilistic modelling I've seen.


You can find videos of MacKay's lectures covering parts of the book online. I'm more of a reader, but I really enjoyed them. And the book is excellent.

Second the MacKay book recommendation! It is a lovely text with a ton of examples, and a good reference for a senior course on applied probability.
