

Algorithms notes from UIUC - pm90
http://www.cs.uiuc.edu/~jeffe/teaching/algorithms/

======
keporahg
I've been meaning to get around to reading those notes. A friend of mine also
told me about this course: <http://www.cs.sunysb.edu/~skiena/392/> - taught
by Steven Skiena (author of The Algorithm Design Manual).

I wish my university had more courses like this, but for now I'll have to
learn from these.

I never really learned anything about pure algorithms in high school, but it
seems like quite a few people in my program covered the basics at their
school. I want to learn more about this style of programming so that I can
compete on sites like TopCoder, but I never know where to start. I'm guessing
these notes are good, but now I just need to find the time to study the
material.

------
tikhonj
The notes for the algorithms class here at Berkeley were extracted into a
pretty good textbook. You can get a draft of it here:
<http://www.cs.berkeley.edu/~vazirani/algorithms.html>. It's worth a look if
you're interested in basic algorithm concepts. (The book is for the
undergraduate introduction to algorithms class and offers a good overview of
the subject, but nothing super advanced.)

While it was published as a book, it really is just a neater version of the
lecture notes for the class.

~~~
zvrba
Why do algorithm books generally devote so much space to the FFT? It seems to
have very narrow applicability: apart from signal processing and multiplying
big integers or polynomials, what else is it useful for?

Wouldn't it be better to explain divide-and-conquer with some other algorithm
having broader applicability?
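For what it's worth, the polynomial-multiplication application is short enough to sketch. Below is an illustrative Python sketch (the function names are mine, and a naive O(n^2) DFT stands in for a real O(n log n) FFT): the key trick is zero-padding both coefficient lists so the circular convolution computed in the Fourier domain doesn't wrap around.

```python
import cmath

def dft(x, invert=False):
    """Naive O(n^2) DFT; a real FFT computes the same thing in O(n log n)."""
    n = len(x)
    sign = 1 if invert else -1
    out = [sum(x[j] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for j in range(n))
           for k in range(n)]
    return [v / n for v in out] if invert else out

def poly_multiply(p, q):
    """Multiply polynomials given as coefficient lists (lowest degree first)."""
    n = len(p) + len(q) - 1          # length of the product's coefficient list
    p = p + [0] * (n - len(p))       # zero-pad to avoid circular wraparound
    q = q + [0] * (n - len(q))
    prod = [a * b for a, b in zip(dft(p), dft(q))]
    return [round(c.real) for c in dft(prod, invert=True)]

# (1 + 2x) * (3 + 4x) = 3 + 10x + 8x^2
print(poly_multiply([1, 2], [3, 4]))  # [3, 10, 8]
```

With a genuine FFT in place of `dft`, this multiplies two degree-n polynomials in O(n log n) instead of the O(n^2) of the schoolbook method.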

~~~
ot
I've always had the feeling that the FFT is taught in the wrong way, at least
in CS courses. They just introduce complex roots of unity and this magic
Vandermonde matrix, then state a scary "convolution theorem", and after a lot
of handwaving you see that you can use it for polynomial multiplication. This
gives basically no insight into _why_ things work.

With just a little bit of linear algebra and abstraction it is easy to gain a
lot of insight into the Fourier transform. For example, this is the way I see it:

* First you introduce shift-invariant linear operators (circulant in the finite-dimensional case), i.e. linear operators that don't change if you shift the vector, apply the operator and then shift it back.

* Then you show that the Fourier basis is THE basis that diagonalizes shift-invariant operators. This is just a matter of multiplying the Fourier vectors with the shift operator and noticing that you get back the same vectors, and it is where the cyclicity of the roots of unity comes into play. This also explains _why_ the Fourier transform works for signal processing: if you have to operate in the frequency domain, shifting your signal in time shouldn't change anything.

* Then you show that convolution (or polynomial multiplication) is shift-invariant: this is just a matter of writing down a convolution and doing a change of variable.

* You are done: since convolution is diagonal in the Fourier basis, it can be applied just by pointwise multiplication of coefficients in the Fourier domain.

* _Now_ you introduce the FFT as an efficient way of computing the Fourier transform, but only after you've understood what the transform is doing.
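Both key claims above can be checked numerically in a few lines. This is an illustrative Python sketch (my own, not from any course notes), using a naive O(n^2) DFT: first that each Fourier vector is an eigenvector of the cyclic shift, then that circular convolution becomes a pointwise product of DFT coefficients.

```python
import cmath

def dft(x):
    """Naive O(n^2) discrete Fourier transform."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]

def idft(X):
    """Naive inverse DFT."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)) / n
            for j in range(n)]

n = 8

# Claim 1: the k-th Fourier vector f_k[j] = exp(2*pi*i*j*k/n) is an
# eigenvector of the cyclic shift (S x)[j] = x[(j - 1) mod n].
k = 3
f_k = [cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)]
shifted = [f_k[(j - 1) % n] for j in range(n)]
eigenvalue = cmath.exp(-2j * cmath.pi * k / n)
assert all(abs(shifted[j] - eigenvalue * f_k[j]) < 1e-9 for j in range(n))

# Claim 2: circular convolution equals a pointwise product of DFT
# coefficients (the convolution theorem).
def circular_convolution(a, b):
    m = len(a)
    return [sum(a[j] * b[(i - j) % m] for j in range(m)) for i in range(m)]

a = [1.0, 2.0, 3.0, 4.0]
b = [0.5, -1.0, 2.0, 0.0]
direct = circular_convolution(a, b)
pointwise = [fa * fb for fa, fb in zip(dft(a), dft(b))]
via_dft = [z.real for z in idft(pointwise)]
assert all(abs(direct[i] - via_dft[i]) < 1e-9 for i in range(len(a)))
```

The eigenvalue check is exactly the "multiply the Fourier vectors with the shift operator" step: shifting multiplies f_k by a phase, so the shift (and any shift-invariant operator) is diagonal in that basis.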

When I realized this, something "clicked" in my mind and I finally
"understood" Fourier analysis.

Maybe I should expand on this a bit more and write a blog post about it
without all the complications that I find in books.

------
aoe
On a related note, are there any good video lectures on Algorithms?

~~~
pmb
MIT OpenCourseWare with Leiserson and Demaine:
<http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-046j-introduction-to-algorithms-sma-5503-fall-2005/video-lectures/>

------
JoelPM
I took CS-373 with Prof. Erickson ten years ago. It was hard, but he was also
possibly the best lecturer I had while in school.

