See for example: https://www.johndcook.com/blog/2010/01/19/dont-invert-that-matrix/
Block Krylov methods are a thing, but I haven't experimented with them yet.
Plus, R. M. Gower is fantastically nice and enthusiastic, so there's that.
And thanks for the link!
If your goal is to solve, say, a least-squares (LS) problem, why not go for CGLS? http://web.stanford.edu/group/SOL/software/cgls/
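For reference, CGLS is short enough to sketch from memory. This is a minimal, unpreconditioned paraphrase (my own illustration, not the SOL code), assuming a dense numpy matrix:

    import numpy as np

    def cgls(A, b, iters=100):
        # Solves min ||A x - b||_2 by applying CG to the normal equations
        # implicitly, without ever forming A^T A.
        x = np.zeros(A.shape[1])
        r = b.copy()           # residual b - A x
        s = A.T @ r            # gradient of the LS objective (up to sign)
        p = s.copy()           # search direction
        gamma = s @ s
        for _ in range(iters):
            q = A @ p
            alpha = gamma / (q @ q)
            x += alpha * p
            r -= alpha * q
            s = A.T @ r
            gamma_new = s @ s
            p = s + (gamma_new / gamma) * p
            gamma = gamma_new
        return x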
You can also use an approximate LU decomposition (e.g. an incomplete LU) as a preconditioner for Krylov methods.
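In SciPy that's a few lines; a minimal sketch, assuming a sparse matrix A (spilu computes an incomplete LU, and its solve method is wrapped as the preconditioner):

    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import spilu, LinearOperator, gmres

    n = 1000
    A = sp.diags([-1, 2.5, -1], [-1, 0, 1], shape=(n, n)).tocsc()
    b = np.ones(n)

    ilu = spilu(A, drop_tol=1e-4, fill_factor=10)  # approximate LU of A
    M = LinearOperator((n, n), matvec=ilu.solve)   # preconditioner M ~= A^{-1}
    x, info = gmres(A, b, M=M)                     # info == 0 means converged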
Similarly, Tikhonov regularization (solving for a range of slightly shifted matrices "A + lambda I", where lambda is a parameter) is easily tackled with Krylov subspace methods, by noting that the Krylov subspace is invariant under shifts like these. So an orthonormal basis for the Krylov subspace has to be computed only once, and can then be reused for every lambda of interest.
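A rough numpy sketch of that idea (my own illustration, not production code; m = 50 is arbitrary): run Arnoldi once on A and b, then solve the small projected problem per shift.

    import numpy as np

    def arnoldi(A, b, m):
        # Builds an orthonormal basis V of K_m(A, b) and the (m+1) x m
        # Hessenberg matrix H with A V[:, :m] = V H. No breakdown handling.
        n = b.size
        V = np.zeros((n, m + 1))
        H = np.zeros((m + 1, m))
        V[:, 0] = b / np.linalg.norm(b)
        for j in range(m):
            w = A @ V[:, j]
            for i in range(j + 1):
                H[i, j] = V[:, i] @ w
                w -= H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            V[:, j + 1] = w / H[j + 1, j]
        return V, H

    def shifted_solves(A, b, lambdas, m=50):
        # One Arnoldi factorization serves every shift, since
        # K_m(A + lam*I, b) = K_m(A, b).
        V, H = arnoldi(A, b, m)
        e1 = np.zeros(m + 1)
        e1[0] = np.linalg.norm(b)
        shift = np.vstack([np.eye(m), np.zeros((1, m))])  # I padded to H's shape
        # GMRES solution per lambda: minimize ||e1 - (H + lam*shift) y||
        return [V[:, :m] @ np.linalg.lstsq(H + lam * shift, e1, rcond=None)[0]
                for lam in lambdas]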
Direct methods such as Gaussian elimination with pivoting are proven to be stable. Iterative methods are not, but they can be a lot cheaper computationally. They can also stop once a given relative residual is reached, unlike direct methods.
If iterative methods like Krylov subspace methods were stable, they could actually be regarded as direct methods themselves: the Krylov subspace has dimension at most N, where N is the number of unknowns, so after at most N iterations the exact solution could be extracted from the search space. In practice this does not happen, due to rounding errors.
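You can see this with a tiny hand-rolled CG on an ill-conditioned SPD matrix (a Hilbert matrix here, purely for illustration): after N iterations the relative residual is nowhere near machine precision.

    import numpy as np

    def cg(A, b, iters):
        x = np.zeros_like(b)
        r = b.copy()           # residual b - A x
        p = r.copy()           # search direction
        rs = r @ r
        for _ in range(iters):
            Ap = A @ p
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

    n = 12
    A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])  # Hilbert matrix
    b = np.ones(n)
    x = cg(A, b, n)  # N steps would be exact in exact arithmetic
    print(np.linalg.norm(b - A @ x) / np.linalg.norm(b))  # well above machine epsilon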
Solving the full quadratic optimization problem for SVMs is basically impossible. You are forming an n x n kernel matrix, so I'm going to let you imagine what happens when n = 100,000.
Usually people either use approximation methods (incomplete Cholesky, Nyström) or do it exactly but iteratively (SMO, Pegasos...).
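For the flavor of it, here is a minimal Nyström sketch (my own illustration; the m = 200 landmarks, gamma, and the RBF kernel are arbitrary choices). It builds an n x m feature matrix Z with K ~= Z Z^T, so a linear solver can stand in for the full kernel machine:

    import numpy as np

    def rbf_kernel(X, Y, gamma):
        # Gaussian kernel via the expanded squared-distance formula
        d2 = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
        return np.exp(-gamma * np.maximum(d2, 0))

    def nystrom_features(X, m=200, gamma=0.1, seed=0):
        rng = np.random.default_rng(seed)
        idx = rng.choice(len(X), size=m, replace=False)  # landmark points
        K_mm = rbf_kernel(X[idx], X[idx], gamma)         # small m x m block
        K_nm = rbf_kernel(X, X[idx], gamma)              # n x m block
        w, U = np.linalg.eigh(K_mm)
        w = np.maximum(w, 1e-12)                         # guard tiny/negative eigenvalues
        return (K_nm @ U) / np.sqrt(w)                   # Z, with Z Z^T = K_nm K_mm^+ K_nm^T

    X = np.random.default_rng(1).normal(size=(100_000, 20))
    Z = nystrom_features(X)  # 100k x 200 instead of a 100k x 100k kernel matrix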
I'm implementing them for class right now, so it's still fresh in my head, haha.
Don't get me wrong, having working code to play with is key, but when you don't fully grasp the concepts behind it, an explanation becomes all the more valuable.
That being said, you've included names, so research can be done. Great work and I hope you're enjoying it!
That's not a whole lot; you're quick!
One vital suggestion to make that path more attractive: use the Jupyter notebook format. It would make it easier to add more documentation and references.
But in any case, thanks for sharing!
idx = np.random.choice(n_features, size=self.max_features, replace=False)  # sample max_features distinct feature indices
This is for people who don't just want to tune parameters but want to build the whole thing from scratch.
I can buy a pie with all the fixin's from a bakery, or I can buy the ingredients myself and make it exactly to my liking. The result may not be professional-grade.
I don't think I'll be implementing as many algorithms as you did, though. I should force myself to work on more projects outside my comfort zone.
One comment I have: in kNN, it is best to ensure that the neighbors list occupies only O(k) space.
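For instance (a minimal sketch, assuming a plain list of points and a distance function), a bounded max-heap keeps only k candidates at any time instead of sorting all n distances:

    import heapq

    def k_nearest(points, query, k, dist):
        # Max-heap via negated distances, capped at k entries: O(k) extra space.
        heap = []  # entries: (-distance, index)
        for i, p in enumerate(points):
            d = dist(p, query)
            if len(heap) < k:
                heapq.heappush(heap, (-d, i))
            elif d < -heap[0][0]:
                heapq.heapreplace(heap, (-d, i))  # evict the farthest kept point
        # return the k points, closest first
        return [points[i] for _, i in sorted(heap, key=lambda t: -t[0])]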
Delivering value trumps painting every day