On the other hand, try implementing a convnet in numpy, especially the backprop, and you might start feeling some of its "rawness" :-)
Something I've always hated: people who say "why do I have to write a backprop when TF does it for me?"
Here's why: you go to a company, and they want you to incorporate machine learning into their C++ engine. Have fun using numpy there. You said you knew machine learning, right? Implement backprop for me, you can do that, right?
I'm developing a model (not exactly a convnet) that includes a correlation step. Because of the problem above and the pure-Python loops it forces, I may have to Cythonize the gradient evaluation or drop down to the NumPy C API. Do you know of any examples I could check out that implement partial derivatives w.r.t. a correlation (or convolution) in "raw numpy"?
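For what it's worth, here's a minimal sketch of what the 1D case looks like in raw numpy (my own illustration, not from any particular library; `corr1d` and `corr1d_backward` are hypothetical names). The key identities: the gradient w.r.t. the kernel is the correlation of the input with the upstream gradient, and the gradient w.r.t. the input is a "full" convolution of the upstream gradient with the kernel.

```python
import numpy as np

def corr1d(x, w):
    # "valid" cross-correlation: y[i] = sum_k x[i+k] * w[k]
    K = len(w)
    return np.array([np.dot(x[i:i + K], w) for i in range(len(x) - K + 1)])

def corr1d_backward(x, w, dy):
    # dy is the upstream gradient dL/dy.
    # dL/dw[k] = sum_i dy[i] * x[i+k]  -> correlate x with dy
    dw = corr1d(x, dy)
    # dL/dx[n] = sum_i dy[i] * w[n-i]  -> "full" convolution of dy with w
    dx = np.convolve(dy, w, mode='full')
    return dx, dw
```

The 2D case is the same idea with flipped kernels and padding bookkeeping, which is exactly where the pure-Python loops start to hurt and im2col-style tricks come in.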