Hacker News
Ask HN: Is machine learning just glorified convex optimisation and statistics?
9 points by noob_eng on April 8, 2023 | 15 comments
I was going through Prof Stephen Boyd's video lecture series on Convex Optimisation on YouTube. I realised that it is basically what we popularly call machine learning, but with proper grounding in theory. Most machine learning courses, even at top schools, give hand-wavy explanations of the methods and teach step-by-step algorithms to apply to data. Why not teach proper theory-based courses like Boyd does? It would prevent practitioners from applying the wrong methods to unsuitable datasets and arriving at false conclusions.
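To make the connection concrete (a minimal sketch with synthetic data, not from the lectures): ordinary least squares, a staple of intro ML, is a convex problem, so the closed-form solution and plain gradient descent provably reach the same global minimum:

```python
import numpy as np

# Synthetic linear data: y = X @ w_true + noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)

# Closed form via the normal equations
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient descent on the (convex) mean squared error
w_gd = np.zeros(3)
lr = 0.01
for _ in range(5000):
    grad = X.T @ (X @ w_gd - y) / len(y)  # gradient of MSE
    w_gd -= lr * grad

print(np.allclose(w_closed, w_gd, atol=1e-4))  # True
```

Convexity is exactly what licenses that guarantee; it is the part most intro courses assert without proof.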



My pleb take is that modern machine learning is just glorified complexity theory. All you are really doing is solving hard learning problems by designing a continuous process, deterministic or randomized; showing it has the desired property of arriving at some optimal solution or probability distribution; and then deriving a discrete algorithm that runs in polynomial time, since mathematical optimization problems are NP-hard in general.

Now we have non-convex neural network models that require non-convex optimization, which to me (again, a pleb take) is just tricks of the trade from complexity theorists adapting the principles of convex optimization to things like gradient descent in deep learning: by observing local smoothness of the training objective at the edge of stability, some convex-style analysis can still be applied.
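A toy illustration of why the convex guarantees break (a sketch on a hand-picked 1-D function, not a neural network): gradient descent on a non-convex objective lands in whichever local minimum's basin the starting point sits in:

```python
# f(x) = x^4 - 3x^2 + x has two local minima (near x = -1.30 and x = 1.13);
# plain gradient descent converges to whichever basin it starts in.
def grad(x):
    return 4 * x**3 - 6 * x + 1  # f'(x)

def descend(x, lr=0.01, steps=2000):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

left = descend(-2.0)   # ends near the left local minimum
right = descend(2.0)   # ends near the right local minimum
print(left, right)     # two different stationary points
```

On a convex objective the two runs would agree; here the initialization decides the answer, which is the whole difficulty the comment is pointing at.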

As for why it's not taught instead of the confusing intro courses, I'm sure there are reasons. But it's another example of how following what universities teach in undergrad is not always the best road map for self-learners.


Where to learn these, if not taught at universities? Books?


Yes. It's optimisation and statistics. I wish to goodness AI had been called something like applied statistical heuristics or complex statistical problem solving.


Many ML methods optimize surfaces that are not convex. I mean, you are not wrong because, yes, ML is Bayesian statistics, but claiming that that is all ML is would be reductive.


Not all of ML is optimization, in the sense of minimizing some error function via computing derivatives. Some methods, like decision trees, tackle the problem in a completely different manner.
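A hedged sketch of that point (toy data, one feature): a decision-tree split is chosen by exhaustively scoring candidate thresholds, with no derivative anywhere:

```python
import numpy as np

def best_split(x, y):
    """Greedy search for the threshold on one feature that minimizes
    the total squared error of the two resulting leaves. No gradients."""
    best_t, best_err = None, float("inf")
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best_err:
            best_t, best_err = t, err
    return best_t

x = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
y = np.array([0.0, 0.1, 0.0, 5.0, 5.1, 5.0])
print(best_split(x, y))  # 3.0 -- the obvious break in the data
```

The search is combinatorial rather than differential, which is why trees handle non-smooth targets that gradient methods struggle with.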


ML / genetic algorithms / etc are indeed often alternative multi-variable optimisers.
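In that spirit, a minimal derivative-free sketch (a (1+1) evolution strategy on an assumed toy objective): mutate, keep whichever candidate scores better, repeat:

```python
import random

def sphere(v):
    # Toy objective: minimize the sum of squares (optimum at the origin)
    return sum(x * x for x in v)

random.seed(0)
best = [random.uniform(-5, 5) for _ in range(3)]
for _ in range(5000):
    # Gaussian mutation, then greedy selection -- no gradients needed
    child = [x + random.gauss(0, 0.1) for x in best]
    if sphere(child) < sphere(best):
        best = child

print(sphere(best))  # close to 0
```

Only function evaluations are used, so the same loop works on objectives with no useful derivative at all.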


In the same way that all of computer science is glorified Boolean logic.


That's a bit too harsh, IMHO.


The OP says nothing about ML architectures and meta-parameters, just like Boolean logic says nothing about algorithms. I think it's pretty spot on.


This is not even wrong.


To be fair - many people in this world are just money making and consuming meatbags with human rights.


this is like saying software engineering is just arithmetic


this was my dumb guy take when i was in math school


What’s your take now?


haha still the same, just the applied versions of those topics that you can run through a machine



