
Yes, this is actually gradient descent, a modified version of AdaGrad.
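The comment doesn't give code, but vanilla AdaGrad (the base of the modified optimizer mentioned) can be sketched as follows; the quadratic objective and hyperparameters are illustrative assumptions, not the author's actual setup:

```python
import numpy as np

def adagrad(grad_fn, x0, lr=0.5, eps=1e-8, steps=2000):
    """Vanilla AdaGrad: each coordinate's step is scaled down by the
    square root of its accumulated sum of squared gradients."""
    x = np.asarray(x0, dtype=float)
    g2 = np.zeros_like(x)              # running sum of squared gradients
    for _ in range(steps):
        g = grad_fn(x)
        g2 += g * g
        x -= lr * g / (np.sqrt(g2) + eps)
    return x

# Minimize the toy objective f(x, y) = x^2 + 10*y^2, gradient (2x, 20y).
xmin = adagrad(lambda v: np.array([2 * v[0], 20 * v[1]]), [3.0, -2.0])
```

The per-coordinate scaling is what makes AdaGrad attractive for badly conditioned landscapes: steep directions automatically get smaller steps.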

We do not yet know whether this can perform faster or better than classical machine learning. There are fundamental difficulties in comparing two ML approaches: what are the criteria? If it takes exponentially less time but achieves lower accuracy, is that acceptable? What if it needs fewer data points to learn?

If the shape of the loss landscape is exponentially more complex, can we still find a way to learn? We need at least some local convexity near the minima.
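The local-convexity condition mentioned above can be checked numerically at a candidate minimum: if a finite-difference Hessian has all-positive eigenvalues, the function is locally convex there. The test function below is a hypothetical stand-in, not anything from the original discussion:

```python
import numpy as np

def hessian_fd(f, x, h=1e-4):
    """Central finite-difference Hessian of a scalar function f at point x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = h
            e_j = np.zeros(n); e_j[j] = h
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * h * h)
    return H

# Illustrative objective: f(x, y) = x^2 + 10*y^2, minimum at the origin.
f = lambda v: v[0] ** 2 + 10 * v[1] ** 2
H = hessian_fd(f, np.zeros(2))
locally_convex = np.all(np.linalg.eigvalsh(H) > 0)
```

A positive-definite Hessian near the minimum is exactly what gives gradient methods like the AdaGrad variant above a basin they can converge into.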

Sorry, more questions than answers; this is where we stand now in Quantum Machine Learning. We are able to do things, but we do not have a clear view.




"Sorry, more questions than answers; this is where we stand now in Quantum Machine Learning."

No problem, I get it, and I appreciate your answers.




