
Since this paper and [1] both beat human-level performance on ImageNet using seemingly unrelated techniques, it seems like there's a clear path to the next state-of-the-art result: implement and train a network that uses both Batch Normalization and PReLUs at the same time. Right?

1: http://arxiv.org/pdf/1502.01852v1.pdf
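To make the suggestion concrete, here is a minimal NumPy sketch of how the two techniques compose in a single layer: batch normalization of the pre-activations followed by a PReLU nonlinearity. All names and hyperparameters here (the layer sizes, `alpha=0.25`, the linear weights) are illustrative, not taken from either paper.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the mini-batch, then scale/shift
    # with learnable parameters gamma and beta (Ioffe & Szegedy).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

def prelu(x, alpha):
    # Parametric ReLU (He et al.): identity for positive inputs,
    # learned slope alpha for negative inputs.
    return np.where(x > 0, x, alpha * x)

# Hypothetical forward pass: linear -> batch norm -> PReLU
rng = np.random.default_rng(0)
x = rng.normal(size=(64, 32))        # mini-batch of 64, 32 features
W = rng.normal(size=(32, 16)) * 0.1  # illustrative linear weights
z = x @ W
out = prelu(batch_norm(z, gamma=np.ones(16), beta=np.zeros(16)),
            alpha=0.25)
```

In training, `gamma`, `beta`, and `alpha` would all be learned by backprop alongside `W`; at inference time, batch statistics are replaced by running averages, as described in the BN paper.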




Exactly my thoughts. The gains from the two incremental improvements look similar in size, so it would be interesting to see their combined effect. It's pretty clear they're reaching the limits of the dataset, though.





