
This is literally the first thing taught in both of the machine learning classes I've taken from Prof. Ng at Stanford, so maybe it's the application of logistic regression, more than the estimation technique itself, that makes it machine learning?



I think it's more that machine learning builds off of earlier work in statistics. If I remember correctly, that class discusses lots of other things that are used in ML but were invented elsewhere (gradient descent, maximum likelihood, matrix methods, etc.)

Arguably this is all just semantics (nature knows no stats/ML divide), but as an ML person I know this drives statistics folks crazy.


There was an earlier era of machine learning that wasn't as statistical. Ng, and most other current ML researchers, now draw heavily on mainstream statistics. It really does make sense to use logistic regression as the foundation for later material.
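To make the connection concrete: logistic regression fit by gradient ascent on the log-likelihood is exactly the statistical technique being discussed, and it is also the template for most of what follows in a course like Ng's. A minimal sketch on hypothetical toy 1-D data (all names and values here are made up for illustration):

```python
import math

# Toy, linearly separable 1-D data (hypothetical): label is 1 for larger x.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Gradient ascent on the log-likelihood (i.e., maximum likelihood estimation).
# The gradient of the log-likelihood w.r.t. (w, b) is sum over points of
# (y - p) * x and (y - p), where p = sigmoid(w*x + b).
w, b = 0.0, 0.0
lr = 0.02
for _ in range(5000):
    gw = sum((y - sigmoid(w * x + b)) * x for x, y in zip(xs, ys))
    gb = sum((y - sigmoid(w * x + b)) for x, y in zip(xs, ys))
    w += lr * gw
    b += lr * gb

preds = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
print(preds)
```

The same loop, with a different link function or loss, gives linear regression, softmax regression, and (stacked) neural networks, which is why it works as a foundation.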

The terminology confusion, I think, stems from the earlier era of ML research.


What did the earlier era use?


In a case like this, perhaps a perceptron?
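For contrast with the probabilistic version: the classic Rosenblatt perceptron uses a hard threshold and an error-driven update rather than a likelihood. A minimal sketch on hypothetical 2-D toy data (values invented for illustration):

```python
# Toy, linearly separable 2-D data (hypothetical): label depends only on x[0].
data = [((0.0, 0.0), 0), ((0.0, 1.0), 0), ((1.0, 0.0), 1), ((1.0, 1.0), 1)]

w = [0.0, 0.0]
b = 0.0

def predict(x):
    # Hard threshold instead of a sigmoid: no probabilities involved.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Classic perceptron rule: on a mistake, nudge the weights toward
# (or away from) the misclassified point.
for _ in range(20):  # a few passes suffice on separable data
    for x, y in data:
        err = y - predict(x)
        if err:
            w[0] += err * x[0]
            w[1] += err * x[1]
            b += err

print([predict(x) for x, _ in data])
```

Note the update is the same shape as the logistic-regression gradient with the sigmoid replaced by a step function, which is why the two sit side by side in the curriculum.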


Which is Ng's class's next chapter after logistic regression. :-)



