This is Jeff Erickson's free textbook (really a set of detailed lecture notes) on Algorithms, a course that Jeff teaches at UIUC and routinely wins teaching awards for. Definitely worth your time. Jeff is working on a similar set of notes for Computational Topology: http://www.cs.uiuc.edu/~jeffe/teaching/comptop/schedule.html. These are not as developed yet. He's also written an interesting piece about getting into a graduate school with low grades: http://3dpancakes.typepad.com/ernie/2005/03/re_phd_with_low.....
For computer science course notes available on the Internet, these seem to be relatively well written. It looks like a lot of thanks is owed to at least D. Knuth's TeX, some graphics packages, and probably also LaTeX.
Cute: by the time computing got powerful enough that everyone could have TeX, LaTeX, and something to turn the output into PDF to create Knuth's "beautiful books", computing also enabled the Internet and wide use of PDF, so the days of tree-killer "books" are nearly over!
Reading through the book quickly and mostly just looking at the topics, he has a lot there. E.g., he includes some material on the fast Fourier transform (FFT).
So, there is a question about selection of content. Of course the FFT is a darned cute, powerful, and valuable algorithm (actually, it is awash in closely related but still significantly different versions; you can find more than one whole book describing a collection of just some of the more popular versions).
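Just to be concrete about the "algorithm" part, here is a minimal sketch of the recursive radix-2 Cooley-Tukey version in Python, assuming the input length is a power of two; the many published variants and the production libraries do far more:

    import cmath

    def fft(x):
        """Recursive radix-2 Cooley-Tukey FFT; assumes len(x) is a power of two."""
        n = len(x)
        if n == 1:
            return list(x)
        even = fft(x[0::2])          # transform of even-indexed samples
        odd = fft(x[1::2])           # transform of odd-indexed samples
        out = [0j] * n
        for k in range(n // 2):
            twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
            out[k] = even[k] + twiddle
            out[k + n // 2] = even[k] - twiddle
        return out

    # Tiny check against the O(n^2) definition of the DFT
    xs = [1.0, 2.0, 3.0, 4.0, 0.0, 0.0, 0.0, 0.0]
    naive = [sum(x * cmath.exp(-2j * cmath.pi * j * k / len(xs))
                 for j, x in enumerate(xs)) for k in range(len(xs))]
    assert all(abs(a - b) < 1e-9 for a, b in zip(fft(xs), naive))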
And to some extent one can look at that algorithm as a lesson in how to do an algorithm and also as an application of some somewhat more general techniques in algorithms. But treating the FFT as mostly just an algorithm is a bit of a strain. Why? Because as soon as the original Cooley-Tukey paper came out in the 1960s (yes, with some ideas that to one extent or another had been published before, going back decades), there was a big splash in military sonar, radar, and more. We're talking a huge splash. It bought me a nice high-end Camaro! I used to drive home going south on Route 29 just south of the JHU/APL in Howard County, Maryland, and going up one particular hill try to get to 100 MPH in second gear with the Turbo 400 automatic transmission! Nice car!
So, with the FFT one could do a wide range of digital filtering. It also got used in X-ray crystallography. At least for a while, Texas Instruments was going to make a 'radio' receiver that would have no local oscillator circuit and, instead, would feed the antenna signal directly to a very sensitive, very fast, high-resolution analog-digital converter and then run that digital output through the FFT as the means of 'tuning' to particular frequencies. Then, of course, it could 'listen' to several different frequencies at once, do spread-spectrum tricks, etc. Big splash.
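A toy illustration of that "tune by FFT" idea, with a made-up sample rate and made-up carrier frequencies (a real receiver would need windowing, decimation, and much more):

    import numpy as np

    fs = 1_024_000                       # assumed ADC sample rate (made up)
    t = np.arange(1024) / fs
    # Pretend the antenna sees two carriers at once, 100 kHz and 250 kHz
    signal = np.sin(2 * np.pi * 100_000 * t) + 0.5 * np.sin(2 * np.pi * 250_000 * t)

    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

    # "Tuning" is just reading off the bins near the frequency of interest
    for f0 in (100_000, 250_000):
        k = np.argmin(np.abs(freqs - f0))
        print(f"{f0 / 1000:.0f} kHz bin magnitude: {np.abs(spectrum[k]):.1f}")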
But what's in the OP on the FFT is essentially just the algorithm. That's a little like making cake batter but leaving out the cake, icing, soaking with Kirschwasser syrup, whipped cream, coffee, etc.! Really, the FFT mostly belongs in an electronic engineering course in signal processing.
Then the book does quite a lot with some discrete optimization. There is a whole stream of such work going back to Ford and Fulkerson, the Hungarian method, the algorithms of Dijkstra and Bellman, applications of the network simplex algorithm, etc. So, there is maximum matching, minimum spanning tree, shortest paths, maximum network flows, least-cost network flows, etc.
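For one of those classics, here is a short sketch of Dijkstra's shortest-path algorithm on a small made-up graph:

    import heapq

    def dijkstra(graph, source):
        """Shortest-path distances from source; edge weights must be non-negative.
        graph: dict mapping node -> list of (neighbor, weight) pairs."""
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue                      # stale queue entry
            for v, w in graph.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist

    # Made-up example graph
    g = {"s": [("a", 2), ("b", 5)], "a": [("b", 1), ("t", 4)], "b": [("t", 1)], "t": []}
    print(dijkstra(g, "s"))   # {'s': 0, 'a': 2, 'b': 3, 't': 4}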
There is some question about teaching this material, especially as computer science: really, originally the material was mostly optimization in operations research. I know, operations research is nearly a dead field, and optimization is not far behind. So, just why should computer science raise these topics from the dead?
One old reason for studying these topics: at first glance, some of those problems look like they would need an exponential algorithm that enumerated all combinations to get an optimal solution, yet there are algorithms that are worst-case polynomial or average-case polynomial. So, these problems and the associated algorithms are a stake in the ground of combinatorial optimization problems that do have polynomial solutions. So, maybe from this stake in the ground we can get some guidance in our search to resolve P versus NP. Maybe. And, yes, there are some selected applications, maybe to IP routing! Still, dragging all UI CS students through this material seems a bit questionable.
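One concrete instance of that contrast is the assignment problem: brute force enumerates all n! matchings, while Hungarian-style methods run in polynomial time. A small sketch, assuming SciPy is available:

    from itertools import permutations
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    cost = np.array([[4, 1, 3],
                     [2, 0, 5],
                     [3, 2, 2]])      # made-up worker x job costs

    # Brute force: try all n! assignments of jobs to workers
    brute = min(sum(cost[i, p[i]] for i in range(3)) for p in permutations(range(3)))

    # Polynomial time: Hungarian-style algorithm from SciPy
    rows, cols = linear_sum_assignment(cost)
    fast = cost[rows, cols].sum()

    assert brute == fast == 5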
Then the book also does the simplex algorithm of linear programming. Okay, it's an algorithm, but really it's the core technique in optimization in operations research and is not really discrete. His treatment looks good for being so short, but, again, there is some question about calling simplex part of computer science and dragging all UI CS undergraduates through it. Besides, the UI operations research, applied math, or math departments could charge the UI CS department with academic theft! So, next, UI CS will be teaching calculus? Linear algebra? Art history?
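For what such a linear program looks like in code, here is a tiny made-up example handed to SciPy's linprog (the solvers behind it include simplex-family methods):

    from scipy.optimize import linprog

    # Made-up problem: maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x, y >= 0.
    # linprog minimizes, so negate the objective.
    c = [-3, -2]
    A_ub = [[1, 1],
            [1, 3]]
    b_ub = [4, 6]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)   # optimum at x = 4, y = 0 with objective value 12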
Yes, in each case, the course is teaching an algorithm, but not all of the ones it is teaching are really of the same kind as quicksort, heap sort, merge sort, AVL trees, red-black trees, extendible hashing, perfect hashing, etc. The difference in "kind"? When you've studied, learned, programmed, timed, and used quicksort and thought about how it does well with virtual memory locality of reference, divide and conquer, recursion, and an opportunity to exploit multi-core processors, then you are done with it. But with the FFT, simplex, and matroids (which it also touches on), you are just getting a peek into what in each case is often a whole book or a whole shelf of books and a significant part of a whole field, e.g., electronic engineering, operations research, combinatorial optimization research, etc. Sounds like a way to keep some UI CS undergraduates really busy and sober and doing homework seven nights a week!
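For contrast, that "done with it" kind of algorithm really does fit in a few lines; here is a quicksort sketch (the versions people actually time and tune are in-place, but the divide-and-conquer shape is the same):

    import random

    def quicksort(a):
        """Simple out-of-place quicksort: divide and conquer around a random pivot."""
        if len(a) <= 1:
            return a
        pivot = random.choice(a)
        less = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        greater = [x for x in a if x > pivot]
        return quicksort(less) + equal + quicksort(greater)

    data = [5, 3, 8, 1, 9, 2, 7]
    assert quicksort(data) == sorted(data)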
Enders A. Robinson, 'Multichannel Time Series Analysis with Digital Computer Programs', Holden-Day, San Francisco.
So, the main application here is finding oil! So, go "boom" and send a sound wave into the earth. At each different layer, you get back a reflection of part of the energy. So, to recover the layers from the recorded signal, just use the FFT. Also have fun with X-ray diffraction, holography, etc.
For signal processing, the core foundational point is that a 'time-invariant linear system' can be fully described by Fourier theory. Basically the output is a convolution of the input with the system's impulse response, and the fast way to do a convolution is with the FFT. The impact of the FFT on signal processing was enormous.
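That fast-convolution point in a few lines of NumPy (zero-pad, multiply the spectra, transform back), with made-up signals:

    import numpy as np

    def fft_convolve(x, h):
        """Linear convolution of x and h via the FFT (zero-padded to avoid wrap-around)."""
        n = len(x) + len(h) - 1
        X = np.fft.rfft(x, n)
        H = np.fft.rfft(h, n)
        return np.fft.irfft(X * H, n)

    x = np.array([1.0, 2.0, 3.0, 4.0])   # made-up input signal
    h = np.array([0.5, 0.5])             # made-up impulse response
    assert np.allclose(fft_convolve(x, h), np.convolve(x, h))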
Yes, algorithms courses are a bit of a grab bag. Really, whether you enjoy them just comes down to your interests. I loved learning algorithms because the important ideas can be conveyed without getting bogged down in applications. On top of that, a lot of what I learned in CS will be useless in 20 years, but algorithms have lasting value.
Concise, elegantly simple, and precisely measured for student consumption over one semester (16-17 weeks).
Masterfully hidden from view are the edits, re-edits, discussions, TeX/LaTeX tutorials, file conversions, hand-written notes, lecture preps, student questions, and a hundred other details that end in an 814-page treatise for humans to enjoy via that "other thing" they did at UIUC.
"Describe and analyze an algorithm that determines, given a legal arrangement of standard pieces on a standard chess board, which player will win at chess from the given starting position if both players play perfectly"
Ok, but [Hint: There is a one-line solution!]
Am I missing something obvious like "it's always a draw"?
You're missing that there is no need for the algorithm to be anywhere close to efficient. Just searching the whole game tree recursively will do it. Remember the draw rules in chess: if no piece is captured and no pawn is moved for fifty moves or so, the game is drawn. Therefore, the game tree is actually of finite height, and the naive algorithm will eventually terminate.
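A sketch of that one idea, written against a hypothetical Position type (outcome(), legal_moves(), apply(), and to_move are all assumed here, not a real chess library; the search is hopeless in practice, which is fine since the exercise only asks for correctness):

    # `Position` is a hypothetical immutable chess-position type assumed to offer:
    #   outcome()     -> "white", "black", "draw", or None if the game is not over
    #                    (with the forced-draw rules built in, so the tree is finite)
    #   legal_moves() -> iterable of moves
    #   apply(move)   -> the resulting Position
    #   to_move       -> "white" or "black"

    def perfect_play_result(pos):
        """Return the result of the game from pos if both sides play perfectly."""
        done = pos.outcome()
        if done is not None:
            return done
        results = {perfect_play_result(pos.apply(m)) for m in pos.legal_moves()}
        if pos.to_move in results:      # the side to move can force a win
            return pos.to_move
        if "draw" in results:           # otherwise settle for a draw if possible
            return "draw"
        return "black" if pos.to_move == "white" else "white"   # every line loses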
Took this with Erickson back in Fall 2002 - go to all the lectures, they're worth it! And find a good study group; some of these homeworks are hard when you're seeing the material for the first time.