

Smoothed Analysis of Algorithms (alternative to worst-case and average-case analyses) - amichail
http://www.cs.yale.edu/homes/spielman/SmoothedAnalysis/framework.html

======
carterschonwald
Awesome, this is my advisor's webpage! A better link would be
<http://www.cs.yale.edu/homes/spielman/>

One of the most striking results you can get with this sort of technique is
the first simplex-style randomized algorithm with a provably polynomial
worst-case (expected) running time. You can also use it to prove nice bounds
for machine learning and graphics techniques that otherwise have pretty
pessimistic worst-case bounds but behave nicely in practice.

~~~
pierrealexandre
I may be totally mistaken, but doesn't the result depend on the units of X
and Y (or, alternatively, on the standard deviation of the Gaussian)?

~~~
carterschonwald
Good question. The heart of smoothed analysis is that your problem needs the
property that a small perturbation of the input data produces only a
similarly small shift in the answer the algorithm computes. Once you've
established that, you try to show that, with high probability, the runtime of
the algorithm on the randomly perturbed data is polynomial in the input size.
And to your point: yes, the bound does depend on the standard deviation of
the Gaussian, but sigma is measured relative to the magnitude of the input,
so the result doesn't depend on the units you choose.
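
For what it's worth, in Spielman and Teng's formulation that dependence is
made explicit: the input \hat{x} is normalized (say \|\hat{x}\| \le 1) and
the smoothed complexity is, roughly,

    C_{\mathrm{smooth}}(n, \sigma)
      = \max_{\|\hat{x}\| \le 1}\;
        \mathbb{E}_{g \sim \mathcal{N}(0,\, \sigma^2 I)}
        \big[\, T(\hat{x} + g) \,\big]

A "polynomial smoothed bound" then means polynomial in both the input size n
and 1/\sigma. Because \sigma is measured relative to the input's norm, the
definition is scale-free: rescaling the units rescales the noise with them.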

So in a certain sense, the strength of smoothed analysis is that by applying
a small amount of noise to a problem (which you can sometimes argue simply
models solving it at fixed precision), you destroy the fragile worst-case
inputs that would otherwise spoil a good running time.

There's a bit more going on, but that's the basic idea.
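
To make the perturb-and-measure idea concrete, here is a toy sketch in
Python (my own illustration, not the simplex result: quicksort with a fixed
pivot stands in for an algorithm with a fragile worst case, and the function
names are made up for the example). We hit the adversarial input with
Gaussian noise whose standard deviation is sigma times the input's magnitude
and watch the expected cost fall as sigma grows:

    import random

    def quicksort_comparisons(a):
        # Iterative quicksort with a first-element pivot, instrumented to
        # count comparisons. The fixed pivot rule is deliberately fragile:
        # an already-sorted input forces ~n^2/2 comparisons.
        a = list(a)
        comparisons = 0
        stack = [(0, len(a) - 1)]
        while stack:
            lo, hi = stack.pop()
            if lo >= hi:
                continue
            pivot = a[lo]
            i = lo + 1
            for j in range(lo + 1, hi + 1):
                comparisons += 1
                if a[j] < pivot:
                    a[i], a[j] = a[j], a[i]
                    i += 1
            a[lo], a[i - 1] = a[i - 1], a[lo]
            stack.append((lo, i - 2))
            stack.append((i, hi))
        return comparisons

    def smoothed_cost(x, sigma, trials=20):
        # Estimate E[cost(x + g)] where each coordinate of g is drawn
        # from N(0, (sigma * max|x_i|)^2). Scaling the noise by the
        # input's magnitude keeps the experiment unit-free.
        scale = sigma * max(abs(v) for v in x)
        total = 0
        for _ in range(trials):
            total += quicksort_comparisons(
                v + random.gauss(0.0, scale) for v in x)
        return total / trials

    n = 1000
    worst = list(range(n))  # sorted input: the adversarial instance
    print("unperturbed:", quicksort_comparisons(worst))
    for sigma in (0.001, 0.01, 0.1):
        print("sigma =", sigma, "->", round(smoothed_cost(worst, sigma)))

The numbers won't match any theorem (quicksort's smoothed complexity is its
own line of research), but the shape is the point: the ~n^2/2 worst case is
brittle, and as sigma grows the expected cost falls away from it toward the
n log n typical case.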

