Yes, k-NN is theoretically one of the best ML algorithms in the sense that it will find the closest items in the training set. For classification or finding similar-looking items it is great. However, it has pretty poor running times when evaluating unseen data (http://nlp.stanford.edu/IR-book/html/htmledition/time-comple...). This is in contrast to something like neural networks, which take a while to train but then evaluate very quickly. Training time matters to an extent in real-world use, but in a web app or real-time application the query latency of k-NN is just impractical.
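To make the latency point concrete, here's a minimal sketch of a brute-force k-NN query (names are illustrative, not from any particular library): every single prediction has to scan the whole training set, so query time grows linearly with n.

    import numpy as np

    def knn_predict(X_train, y_train, x_query, k=5):
        # O(n * d): distance from the query to every stored example
        dists = np.linalg.norm(X_train - x_query, axis=1)
        # O(n) with argpartition to pick out the k nearest indices
        nearest = np.argpartition(dists, k)[:k]
        # majority vote among the k nearest labels
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        return labels[np.argmax(counts)]

Compare that with a trained neural net, where a prediction is a fixed number of matrix multiplies regardless of how much data it was trained on.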
That's why we developed the "Boundary Forest" algorithm: a nearest-neighbor-type algorithm with generalization at least as good as k-NN's that can respond to queries very quickly.
It maintains trees of examples that let it train on and respond to queries in time logarithmic in the number of stored examples, which can be much smaller than the total number of training samples. It thus keeps k-NN's very fast training time, works online, and handles regression as well as classification.
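Here's a rough sketch of the boundary-tree idea as I understand it from the description above (the actual Boundary Forest algorithm, with its ensemble of trees, has more to it than this): queries descend greedily toward the closest stored example, and a new example is stored only when the tree currently misclassifies it.

    import numpy as np

    class BoundaryTree:
        def __init__(self, x, y):
            self.x, self.y = x, y      # stored example and its label
            self.children = []

        def query(self, x):
            # Greedy descent: keep moving to whichever of the current
            # node and its children is closest to the query. Depth, and
            # hence query time, is typically logarithmic in the number
            # of stored examples.
            node = self
            while True:
                candidates = [node] + node.children
                best = min(candidates,
                           key=lambda n: np.linalg.norm(n.x - x))
                if best is node:
                    return node
                node = best

        def train(self, x, y):
            # Online update: store the example only if the tree gets it
            # wrong, which is why the stored set can stay much smaller
            # than the full training set.
            node = self.query(x)
            if node.y != y:
                node.children.append(BoundaryTree(x, y))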
I wouldn't call it theoretically the best. It is affected by outliers and doesn't do any generalization at training time; this latter point raises the question of whether it even deserves the name "learning". I would say linear models are typically a better learning algorithm. I wouldn't know what to call "the best" algorithm, but nowadays it might be deep learning.