

OCR and Neural Nets in JavaScript - option_greek
http://ejohn.org/blog/ocr-and-neural-nets-in-javascript/

======
bkanber
I'm writing an ML-in-JS series and I was excited to see another person talking
about neural networks in JavaScript (the subject of an upcoming post of mine).
I was a little disappointed to see that the article doesn't discuss the actual
neural net, though!

But still, the message is clear: JavaScript can do some damned cool things.

~~~
johnleppings
It's a Turing-complete language. Why is it surprising or noteworthy that you
can implement ML algorithms in it?

~~~
bkanber
Because up until relatively recently, JS was slow and crippled. The public
image of JS is that its only use is Ajax and making popups. It's only recently
been recognized as a "real language". Surprising and noteworthy that you can
do ML in JS? No. But novel? Yes.

------
drub0y
I really don't know why people are so surprised/amazed by this kind of stuff.
As soon as JavaScript had access to pure image data in the browser there was
nothing else holding it back from these kinds of algorithms. It was always the
sandbox that was holding it back, not the language.

Cool nonetheless...

~~~
petercooper
Bear in mind this post is from 2009 so it was a bit more amazing back then ;-)

------
slig
Related: <https://www.coursera.org/course/neuralnets> starting on 24 September
2012.

Is anyone here going to take this class?

I'm currently doing the ML class, so I'm not sure.

~~~
option_greek
Signed up. I'm doing the Algorithms I course now, so I'm not sure if I can
complete all the assignments. Btw, how is the ML course? Is it too
math-intensive...

~~~
Niten
I'm taking both of those as well, and the math in ML isn't intensive... but
there is much more exposure to calculus and linear algebra in ML than in
Algorithms I. While I agree with the other commenter who said Andrew Ng walks
you through it, an understanding of both helps with getting the "why" of
things like the gradient descent formulas introduced in the linear regression
lectures.
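
For anyone curious what those formulas look like in code, here's a minimal
sketch of batch gradient descent for univariate linear regression in plain
JavaScript. The function name, sample data, and hyperparameters are all made
up for illustration; they aren't from the course or the article.

```javascript
// Batch gradient descent for univariate linear regression, minimizing
// J(theta) = (1/2m) * sum((theta0 + theta1*x - y)^2) over the samples.
function gradientDescent(xs, ys, alpha, iterations) {
  const m = xs.length;
  let theta0 = 0, theta1 = 0;
  for (let iter = 0; iter < iterations; iter++) {
    let grad0 = 0, grad1 = 0;
    for (let i = 0; i < m; i++) {
      const err = theta0 + theta1 * xs[i] - ys[i];
      grad0 += err;         // partial derivative w.r.t. theta0
      grad1 += err * xs[i]; // partial derivative w.r.t. theta1
    }
    // Simultaneous update of both parameters, scaled by learning rate alpha.
    theta0 -= (alpha / m) * grad0;
    theta1 -= (alpha / m) * grad1;
  }
  return [theta0, theta1];
}

// Fit y = 2x + 1 from a few noiseless samples.
const [t0, t1] = gradientDescent([0, 1, 2, 3], [1, 3, 5, 7], 0.1, 5000);
console.log(t0.toFixed(2), t1.toFixed(2)); // → 1.00 2.00
```

The "simultaneous update" comment is the part the lectures stress: both
gradients are accumulated before either parameter is changed.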

The ML course does start out with an optional linear algebra review lecture,
but it's very focused on mechanics rather than underlying mathematical
reasoning... for conceptual mastery (or even just brushing up) I'd recommend
going through Khan's videos on the subject, if you have the time.

------
taylorbuley
I've been working on a basic multi-layer perceptron network using IndexedDB. I
found IndexedDB performance isn't quite fast enough to pull it off with the
algorithms I was using.

~~~
option_greek
Is there any specific reason you are using a DB in this case? As I understand
it, in most cases the neural net data stays in memory and is persisted to disk
only to save and load the network.
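
To sketch what I mean: the weights can live in plain arrays while training
and inference run, and "persistence" is just serialization at the boundaries.
This is an illustrative toy, assuming a single fully-connected layer; the
helper names (makeNetwork, saveNetwork, loadNetwork) are made up.

```javascript
// A tiny in-memory fully-connected layer: one weight row per output unit.
function makeNetwork(inputs, outputs) {
  const weights = [];
  for (let o = 0; o < outputs; o++) {
    weights.push(Array.from({ length: inputs }, () => Math.random() - 0.5));
  }
  return { inputs, outputs, weights };
}

// Persist the network as a JSON string: in the browser this could go to
// localStorage or IndexedDB; on a server, to disk. Either way, the DB is
// only touched on save/load, never inside the training loop.
function saveNetwork(net) {
  return JSON.stringify(net);
}

function loadNetwork(json) {
  return JSON.parse(json);
}

const net = makeNetwork(3, 2);
const restored = loadNetwork(saveNetwork(net));
console.log(restored.weights[0].length); // → 3
```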

~~~
taylorbuley
For me the big win is if this can be done purely on the client side. I see two
main advantages: 1) the user doesn't have to give up her privacy in exchange
for recommendations, and 2) the service doesn't have to burn CPU it otherwise
would have.

