
Isn't modern ML based on differential equations?



Gradient descent is built on partial differentiation, and Lagrange multipliers come up in the context of nearest-neighbor searches. I’m almost positive most people just use these methods without actually understanding the theory behind why they work, though.
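
To make that concrete, here's a minimal toy sketch (my own example, not from any particular library) of gradient descent where the partial derivatives are written out by hand:

    # Gradient descent on f(x, y) = (x - 3)^2 + (y + 1)^2,
    # using hand-derived partial derivatives (no autodiff library).
    def grad_f(x, y):
        # df/dx = 2(x - 3), df/dy = 2(y + 1)
        return 2 * (x - 3), 2 * (y + 1)

    x, y = 0.0, 0.0
    lr = 0.1                      # learning rate (step size)
    for _ in range(100):
        dx, dy = grad_f(x, y)
        x -= lr * dx              # step against the gradient
        y -= lr * dy

    print(x, y)                   # converges toward the minimum at (3, -1)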


Sure, but only a handful of people really employ advanced calculus when building ML systems; the vast majority of data scientists use ML algorithm implementations as a black box, without diving into their inner workings.


I realize that there is a certain conflict here, since I'm happy to program computers without understanding the physics of transistors, but your comment makes me nervous.


I imagine the DE part of ML is, practically speaking, imported from a library somewhere by practitioners.
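
As an illustration of that (a toy sketch using JAX as the autodiff library, which the comment above doesn't name), the derivative bookkeeping is usually hidden behind a single library call:

    # Sketch of library-supplied differentiation: grad() returns the
    # gradient function of a scalar-valued Python function, so the
    # practitioner never writes a derivative by hand.
    import jax.numpy as jnp
    from jax import grad

    def loss(w, x, y):
        pred = jnp.dot(x, w)
        return jnp.mean((pred - y) ** 2)   # mean squared error

    grad_loss = grad(loss)                 # d(loss)/dw, derived automatically

    w = jnp.zeros(3)
    x = jnp.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
    y = jnp.array([1.0, 2.0])
    for _ in range(200):
        w = w - 0.01 * grad_loss(w, x, y)  # plain gradient descent step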



