
There is no agreement on the exact meaning of ML and AI. They are often used interchangeably. And for good reason, because it's all about getting computers to learn. We should not squabble about semantics.

Can you point towards papers that report substantial GPU acceleration on what you call "non-deep ML"?




You're certainly right that AI/ML are often used interchangeably, though I think the distinction is pretty clear among practitioners: AI is the big circle, ML is a subset of AI, and deep learning is a subset of ML. Symbolic AI is also a subset of AI, though I'm not familiar enough with it to say how much it intersects the others.

So ML is AI, but just a small part of it.

As for papers, here's one showing GPU speedups for the three big GBM packages: https://arxiv.org/pdf/1809.04559.pdf. Their setup shows a 3-7x speedup for XGBoost.
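For a sense of what "GPU speedup" means in practice for a GBM library: in XGBoost it's a one-parameter change to the histogram tree method. A hedged sketch of the parameter dicts (not the linked paper's benchmark setup; the exact spelling depends on the XGBoost version, and the training call is commented out since it needs data and a CUDA device):

```python
# CPU baseline: histogram-based tree construction.
params_cpu = {"tree_method": "hist", "objective": "reg:squarederror"}

# GPU: same algorithm on CUDA. XGBoost >= 2.0 selects the device via
# device="cuda"; older releases used tree_method="gpu_hist" instead.
params_gpu = {"tree_method": "hist", "device": "cuda",
              "objective": "reg:squarederror"}

# Usage sketch (requires xgboost and a CUDA device):
# import xgboost as xgb
# dtrain = xgb.DMatrix(X, label=y)
# booster = xgb.train(params_gpu, dtrain, num_boost_round=500)
```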


Let's not forget Gaussian processes: https://gpytorch.ai/ and https://www.kernel-operations.io/keops/index.html. Some of the work I've been doing in this field only became feasible in a reasonable amount of time due to the use of GPUs.
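The reason GPs benefit so much from GPUs is that exact GP regression reduces to dense kernel-matrix algebra, dominated by an O(n^3) Cholesky solve. A minimal NumPy sketch of that computation (RBF kernel, toy data; all names are illustrative, not GPyTorch's API, which moves exactly these operations onto the GPU):

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0):
    # Squared-exponential kernel matrix between two 1-D point sets.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 5.0, 50)
y_train = np.sin(x_train) + 0.05 * rng.normal(size=50)
noise_var = 0.05 ** 2

# K + sigma^2 I: the n x n matrix whose Cholesky factorization is the
# O(n^3) bottleneck that GPU linear algebra accelerates.
K = rbf(x_train, x_train) + noise_var * np.eye(50)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))

# Posterior mean at two test points.
x_test = np.array([1.0, 2.5])
mean = rbf(x_test, x_train) @ alpha
```

With 50 noisy samples of sin(x), the posterior mean at x=1.0 lands close to sin(1.0); scaling n into the tens of thousands is where libraries like GPyTorch and KeOps become necessary.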



