Hacker News

Not sure what set off the downvotes, but we can teach a computer to recognize characters for OCR applications. We ourselves learned those characters by being taught in school and by reading bad handwriting; we teach computers the same way. How are they supposed to recognize them magically, especially since they didn't invent whatever language it is?

Note that I'm referring to artificial general intelligence[0].

0. http://en.wikipedia.org/wiki/Artificial_general_intelligence
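The "we teach computers the same way" point can be made concrete with a toy sketch: show the machine labeled examples, and have it label new input by similarity to what it has seen. This is a hypothetical nearest-neighbor illustration (the bitmaps and names are made up), not any particular OCR system:

```python
# Labeled 3x3 "bitmaps" standing in for scanned characters --
# the examples the computer is "taught" (hypothetical training data).
TRAINING = {
    "I": (0, 1, 0,
          0, 1, 0,
          0, 1, 0),
    "L": (1, 0, 0,
          1, 0, 0,
          1, 1, 1),
}

def classify(bitmap):
    """Label an unseen bitmap with the nearest training example
    (fewest differing pixels): recognition comes entirely from
    the examples the system was shown, not from built-in knowledge."""
    def distance(a, b):
        return sum(x != y for x, y in zip(a, b))
    return min(TRAINING, key=lambda label: distance(TRAINING[label], bitmap))

# A smudged "L" (one pixel flipped) is still closest to the "L" it learned.
smudged_L = (1, 0, 0,
             1, 0, 0,
             0, 1, 1)
print(classify(smudged_L))  # -> L
```

Real OCR uses far richer models, but the principle is the same: no examples, no recognition.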




This was back in the 1990s, but I worked in what was basically a data entry company, where the processing was a mixture of scanned forms (Scantron style) and human key entry. The project I was on was image scanning the forms for other purposes, so we had a neural net handwriting recognition system that we were comparing to human key entry at a large scale - millions of documents.

What we found was that human key entry significantly outperformed the neural nets, even when the data was carefully handwritten in constrained boxes. Humans were so far ahead of the heavily trained neural nets that the software was basically unusable at that point.

Of course, that was nearly 20 years ago, and things have probably moved on quite a bit. But you can still see the basic problem in Captcha-style validation on web pages. Computers just can't be trained to recognize distorted text that humans can read pretty easily.



IIRC, Yann LeCun's convolutional neural nets already outperform humans at MNIST digit recognition; most US mail is auto-sorted by those machines...
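The building block behind those convolutional nets is a small kernel slid across the image. Here is a minimal stdlib-only sketch of that convolution step (really cross-correlation, as most deep-learning libraries implement it); the edge-detector kernel and toy image are illustrative assumptions, not LeCun's actual network:

```python
def conv2d(image, kernel):
    """Valid 2-D convolution over nested lists: slide the kernel
    across the image and take a dot product at each position."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            )
    return out

# A vertical step edge in a tiny 4x4 "image"...
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# ...and a 2x2 vertical-edge detector kernel.
kernel = [
    [-1, 1],
    [-1, 1],
]
response = conv2d(image, kernel)  # fires (value 2.0) exactly along the edge
```

A convnet stacks many such learned kernels, so features like strokes and loops of handwritten digits emerge from training rather than being hand-coded.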



