
What sort of mechanism is being used for the learning step? To my eyes, it isn't gradient descent, or, if it is, it wasn't immediately visually obvious to me what gradient is being descended.

I also ask this because, again at least visually, quantum perceptrons definitely have a different classification shape than conventional perceptrons. But the first question I'd ask is whether that difference requires the "full" quantum formulation, or whether you could get the same shape from something less than exponentially complicated, and perhaps amenable to some other mathematical analysis possible in the classical regime. (See also the work challenging D-Wave a couple of years ago, where they "beat" classical algorithms and, within a week, the classical algorithms had been improved to parity, simply because nobody had ever really looked at them with an eye to optimization before.)
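For reference, the conventional baseline being compared against is the classical perceptron, whose decision boundary is always a hyperplane (a straight line in 2D), so anything curved already looks different. A minimal sketch below; variable names are mine, not from the paper:

    # Minimal classical perceptron: its decision boundary is always
    # the hyperplane w @ [x, 1] = 0, i.e. a straight line in 2D.
    import numpy as np

    def perceptron_train(X, y, epochs=100):
        # X: (n, d) inputs; y: labels in {-1, +1}
        w = np.zeros(X.shape[1] + 1)                  # weights plus bias
        Xb = np.hstack([X, np.ones((len(X), 1))])     # append bias column
        for _ in range(epochs):
            for xi, yi in zip(Xb, y):
                if yi * (w @ xi) <= 0:                # misclassified point
                    w += yi * xi                      # standard perceptron update
        return w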




Yes, actually this is gradient descent, using a modified version of Adagrad.
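For readers unfamiliar with it, the standard Adagrad update (Duchi et al., 2011) looks roughly like the sketch below; the exact modification used here isn't specified, and the names are illustrative:

    # Standard Adagrad step: per-parameter learning rates scaled by
    # the accumulated history of squared gradients.
    import numpy as np

    def adagrad_step(params, grad, accum, lr=0.01, eps=1e-8):
        accum = accum + grad ** 2                     # accumulate squared gradients
        params = params - lr * grad / (np.sqrt(accum) + eps)
        return params, accum                          # carry accum across steps

    # usage: start with accum = np.zeros_like(params) and call once per step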

We do not yet know whether this can perform faster or better than classical machine learning. There are some fundamental difficulties in comparing two ML approaches: what are the criteria? If it takes exponentially less time but achieves lower accuracy, is that okay? What if we need fewer data points to learn?

If the shape of the loss landscape is exponentially more complex, can we still find a way to learn? We need at least some local convexity near the minima.
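To make "local convexity" concrete: a loss is locally convex at a point when its Hessian is positive semidefinite there, which can be checked numerically. A rough sketch, assuming a generic scalar loss function (all names are placeholders):

    # Finite-difference Hessian and a local-convexity check.
    import numpy as np

    def hessian(loss, theta, h=1e-5):
        n = len(theta)
        H = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                tpp = theta.copy(); tpp[i] += h; tpp[j] += h
                tpm = theta.copy(); tpm[i] += h; tpm[j] -= h
                tmp = theta.copy(); tmp[i] -= h; tmp[j] += h
                tmm = theta.copy(); tmm[i] -= h; tmm[j] -= h
                # central difference for the mixed partial d2L/dti dtj
                H[i, j] = (loss(tpp) - loss(tpm) - loss(tmp) + loss(tmm)) / (4 * h * h)
        return H

    def locally_convex(loss, theta):
        # positive semidefinite Hessian (up to numerical tolerance)
        return np.all(np.linalg.eigvalsh(hessian(loss, theta)) >= -1e-6)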

Sorry, more questions than answers, this is where we stand now in Quantum Machine Learning. We are able to do things but we do not have a clear view.


"Sorry, more questions than answers, this is where we stand now in Quantum Machine Learning."

No problem, I get it, and I appreciate your answers.



