
Why the name "dynamic programming"? - blasdel
http://arcanesentiment.blogspot.com/2010/04/why-dynamic-programming.html
======
jsyedidia
"Dynamic programming" may not be the best name, but it's well-established, and
there's always a cost to proposing another name: fragmentation of
understanding. It's better for everybody to continue using the same non-optimal
name than to have different groups of people using different names for the same
concept without realizing they're talking about the same thing.

Therefore, I think it's best to adhere to the rule, which is in fact generally
followed by scientists, that only researchers who publish something new should
propose new names.

In fact, "dynamic programming" already has a lot of other names.
For example, certain versions of the "belief propagation" algorithm might be
considered to be a form of dynamic programming, and the "belief propagation"
algorithm itself has versions that are called the "forward backward algorithm"
for Hidden Markov Models, the "Viterbi algorithm," the "sum-product
algorithm," the "turbo-decoding" algorithm, the "Kalman filter," and the
"transfer matrix." The reason there are so many names is that the "belief
propagation" algorithm is a great algorithm that is often optimal, so it kept
being reinvented.

It's not as bad as it seems though; an important advance that has occurred
over the last decade is that researchers from very different communities have
all started to describe these essentially similar algorithms using the same
visual language of "graphical models." To learn more about that, see the book
on "Information, Physics, and Computation," by Mezard and Montanari, the book
on "Probabilistic Graphical Models," by Koller and Friedman, or my own
articles ;-).

------
naveensundar
It seems that the author has confused memoization with dynamic programming.
You can use memoization in DP, but you need not always (it just makes things
more efficient). DP is much more than memoization: it lets you solve a certain
class of problems. For example problems, see
<http://people.csail.mit.edu/bdean/6.046/dp/>. The chapter on DP from
Kleinberg's book (<http://www.aw-bc.com/info/kleinberg/index.html>) is free:

<http://www.aw-bc.com/info/kleinberg/assets/downloads/ch6.pdf>

------
jallmann
Memoization is a form of dynamic programming, but dynamic programming is not
just memoization.

~~~
pkrumins
Memoization actually is not a form of dynamic programming, it's just a
technique used in dynamic programming to avoid recomputation.

~~~
nostrademons
You could argue that what dynamic programming does isn't even really
memoization in the sense that it's used in the Lisp/FP world.

Dynamic programming algorithms usually work from the bottom-up: you fill a
table with the first stage of the algorithm, and then compute subsequent
stages until you've converged on a solution. Memoization almost always works
from the top-down: you start with a naive recursive algorithm, and then store
function call results in a hashtable or other data structure to avoid
recomputing them.

They both involve avoiding recomputation by storing intermediate results in
memory. But saying that this makes them the same is sorta like saying that
recursion and iteration are the same because they both use the program
counter.
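The contrast can be sketched with Fibonacci (my example, not the commenter's; it's just the smallest problem with overlapping subproblems):

```python
from functools import lru_cache

# Top-down memoization: start from the naive recursion,
# cache results of each call to avoid recomputing them.
@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up dynamic programming: fill the table from the
# base cases upward; no recursion anywhere.
def fib_dp(n: int) -> int:
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]
```

Same stored intermediates, but the two present very different abstractions: one looks like the recursive definition, the other like a loop over a table.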

~~~
barrkel
Recursion with a tail call and iteration are the same, though. The only
difference between tail recursion and iteration is whether the target of the
jump happens to be the start of a function.
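A sketch of the equivalence (in Python, which notably does not eliminate tail calls, so this is illustration rather than practice):

```python
# Tail-recursive sum: the recursive call is the last thing the
# function does, so a compiler with tail-call elimination can
# turn it into a jump back to the top -- i.e. a loop.
# (CPython does not do this, so this version still grows the stack.)
def sum_tail(xs, acc=0):
    if not xs:
        return acc
    return sum_tail(xs[1:], acc + xs[0])

# The iterative form such a compiler would effectively produce:
def sum_iter(xs):
    acc = 0
    for x in xs:
        acc += x
    return acc
```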

~~~
nostrademons
They're the same in terms of what the machine is doing, assuming your compiler
does tail-call optimization. That's not the same as being "the same" - they
present a very different abstraction to the programmer. That was the point of
my analogy.

You can see this difference by looking at one of the ways this abstraction
leaks. Consider stack traces. In a sane language, you expect an error to give
you a stack trace of functions that have been called. If you treat recursion
as iteration, then either you'll have to omit some function calls from the
stack, or your algorithm won't work in constant space anymore. Either violates
some expectations of the programmer.

I also said "recursion" and not "tail recursion". There are some algorithms
that are impossible to implement under iteration without an external stack
(e.g. tree traversal), and others where you _need_ to use iteration because a
stack is not appropriate (e.g. breadth-first search).
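Both points can be made concrete with a toy tree (the tree and names here are my own illustration):

```python
from collections import deque

# A tiny tree as an adjacency dict (hypothetical example data).
tree = {"a": ["b", "c"], "b": ["d", "e"], "c": [], "d": [], "e": []}

# Depth-first traversal: iteration alone isn't enough; you need a
# stack (an explicit one here, standing in for recursion's call stack).
def dfs(root):
    order, stack = [], [root]
    while stack:
        node = stack.pop()
        order.append(node)
        stack.extend(reversed(tree[node]))
    return order

# Breadth-first search: a stack gives the wrong visit order;
# you need a queue, which plain recursion doesn't give you.
def bfs(root):
    order, queue = [], deque([root])
    while queue:
        node = queue.popleft()
        order.append(node)
        queue.extend(tree[node])
    return order
```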

------
cloudkj
The way I think about it, in the simplest form, dynamic programming
essentially boils down to a way of solving a problem with overlapping
subproblems and an optimal substructure. Memoization is a technique that can
achieve similarly efficient results, but is more of an "addendum" to
traditional recursive formulations to avoid the cost of solving overlapping
subproblems. The two are related, but not quite the same.
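Both properties show up in a small example like minimum coin change (my example, not the commenter's):

```python
# Fewest coins summing to `amount`, or -1 if impossible.
# Optimal substructure: best(a) = 1 + min(best(a - c)) over coins c.
# Overlapping subproblems: many amounts recur, so each table
# entry is computed exactly once.
def min_coins(coins, amount):
    INF = float("inf")
    best = [0] + [INF] * amount   # best[a] = fewest coins summing to a
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return best[amount] if best[amount] < INF else -1
```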

------
ezyang
I had a friend describe it to me as a class of algorithms that involve
"filling a table." Not 100% correct, and certainly not catchy, but it
definitely helped guide my understanding of the practice.

~~~
argv_empty
_Not 100% correct, and certainly not catchy, but it definitely helped guide my
understanding of the practice._

I had the opposite experience with this description. I could see that there
was a table being filled in, but it didn't give me any understanding of what
the table gets filled with and why. My capability with it remained
cargo-cultish until I had seen memoization isolated and discussed explicitly.

~~~
ezyang
Interesting! I always thought the table metaphor really appealed to a "pencil
and paper" mentality, which would be all about computing "more and more pieces
of the table" until it was full.

~~~
argv_empty
It does. The problem is that it doesn't really explain why what you're filling
in on the table is correct. Learning algorithms by rote was possible, but
devising them for problems I hadn't seen before was not. Now, I don't need to
memorize -- reconstruction is easier.

------
frou_dh
That's it - there's definitely a phenomenon of earnestly linking to Wikipedia
while belittling it at the same time.

------
kevinh
I find that dynamic programming sounds less threatening than memoization. When
I learned dynamic programming, my first thought was, "damn, that sounds cool.
It's dynamic!" Memoization would have been a foreign, and confusing, word at
that point.

~~~
pkrumins
That article is just wrong; forget everything it said. Memoization is not
dynamic programming.

------
richcollins
I always thought of it as equivalent to late binding. I'm not sure what it has
to do with memoization.

~~~
alexkay
You are mixing dynamic programming with dynamic typing.

~~~
eru
The situation is even worse with lambda calculus. Lambda calculus should have
been named function theory, but that term was already used by an interesting
part of analysis, that should have been named complex analysis.

~~~
Darmani
I don't know why you think that; it's really, really important that lambda
terms are not functions. Bertrand Russell would explode if I tried to write
f(f) for any function f.

~~~
eru
Just use typed lambda calculus if you want to avoid that.

------
10ren
for the meaning:
[http://en.wikipedia.org/wiki/Dynamic_programming#Dynamic_pro...](http://en.wikipedia.org/wiki/Dynamic_programming#Dynamic_programming_in_computer_programming)

------
joubert
I wonder what "static programming" would be.

