Dynamic Programming: Chain Matrix Multiplication 30 points by HalcyonicStorm on Jan 31, 2013 | 13 comments

 Thanks for this nice article. I love the option to switch between slideshow and contiguous views. I didn't understand the statement "The boxes with hyphens will be ignored because matrix multiplication works from left to right, not the other way around." (from Preliminaries 3: Cost table). Matrix multiplication is associative, so it can be grouped however you like and then reduced left to right or right to left. I think the relevant point is that the cost is symmetric, in the sense that the two orders of reduction give the same cost, so there's no need to fill in the sub-diagonal entries. Is that correct? Finally, I'm sorry to nit-pick, but the plural of 'matrix' is 'matrices' (just one 'i'). At least you didn't use the singular 'matrice'. :-)
 Thanks, I added that because the slideshow was rough if not viewed at a wide enough screen resolution. As for left to right vs. right to left: you can't fill the sub-diagonal entries because those would switch the order in which the matrices are multiplied and give you a different answer, i.e., AxB != BxA. As for the terrible spelling, there's a reason why I do math :P
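 To make the cost-table discussion concrete, here is a minimal sketch of the standard matrix-chain-order DP (the function name, the `dims` convention, and the example dimensions are my own illustration, not from the post). Note that it only ever fills entries on or above the diagonal:

```python
# Matrix chain order: matrix i has shape dims[i] x dims[i+1].
# cost[i][j] = minimum number of scalar multiplications needed to
# compute the product of matrices i..j. Only entries with i <= j are
# filled in -- the sub-diagonal would correspond to reordering the
# matrices, which matrix multiplication does not allow.
def matrix_chain_cost(dims):
    n = len(dims) - 1  # number of matrices in the chain
    cost = [[0] * n for _ in range(n)]
    for length in range(2, n + 1):          # chain length
        for i in range(n - length + 1):
            j = i + length - 1
            # try every parenthesization split point k
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j]
                + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return cost[0][n - 1]

# Example: A (10x30), B (30x5), C (5x60).
# (A*B)*C costs 10*30*5 + 10*5*60 = 4500,
# while A*(B*C) costs 30*5*60 + 10*30*60 = 27000.
print(matrix_chain_cost([10, 30, 5, 60]))  # 4500
```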
 I can't stand the fact that the author is using "X" and "*" as multiplication symbols when we have × and · provided by Unicode. It makes the text much uglier than needed.
 The post has been updated for your viewing pleasure. Thanks for the tip and thanks for reading.
 Awesome, thanks!
 Nice, but I feel I should point out that there are faster ways to multiply matrices than the naive O(n^3). I think we're down to O(n^2.3something). http://en.wikipedia.org/wiki/Matrix_multiplication#Algorithm...
 Agreed. I was talking about the naive method of multiplying matrices. I could make another huge blog post on matrix multiplication algorithms. I am a particular fan of solving Ax=B type problems... hmm, maybe that could be a future post.
 Fun fact: this is pretty much the same dynamic program as CKY, the parsing algorithm that can parse any context-free language in the appropriate normal form (only productions looking like A->BC or A->a) in time cubic in the length of the sentence.
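 For comparison, a minimal CKY recognizer (the grammar encoding and all names here are my own illustration) fills the same kind of upper-triangular table, with the split point k playing the role of the parenthesization point in the matrix-chain DP:

```python
# Minimal CKY recognizer for a grammar in Chomsky normal form.
# grammar maps (B, C) -> set of A for binary rules A -> B C;
# terminals maps a word -> set of A for lexical rules A -> a.
def cky_recognize(words, grammar, terminals, start="S"):
    n = len(words)
    # chart[i][j] = set of nonterminals deriving words[i:j+1];
    # as in the matrix-chain table, only the upper triangle is used.
    chart = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        chart[i][i] = set(terminals.get(w, ()))
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            for k in range(i, j):           # split point, like the chain's k
                for B in chart[i][k]:
                    for C in chart[k + 1][j]:
                        chart[i][j] |= grammar.get((B, C), set())
    return start in chart[0][n - 1]

# Tiny assumed grammar: S -> NP VP, NP -> "she", VP -> "runs"
terminals = {"she": {"NP"}, "runs": {"VP"}}
grammar = {("NP", "VP"): {"S"}}
print(cky_recognize(["she", "runs"], grammar, terminals))  # True
```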
 Wow, I didn't know that...thanks, I will look into it
 Nice article, but I feel I should point out that an implementation of the Fibonacci sequence might be easier to follow. It's an overall simpler problem.
 "...dynamic programming, a fancy term for breaking a big problem down into subproblems which are easier to solve. (Note: Not the technical definition)"

 Is the author trying to redefine what Dynamic Programming means, or is he just totally confusing his readers by actually using a DP algorithm while saying he's not using the technical definition!?

 I mean: he's basically saying that Dynamic Programming is "Divide and Conquer". I'm not sure I follow him.

 Especially since, contrary to memoization (yes, "memoization", not "memorization", there's no 'r'), DP doesn't just compute and cache what's necessary but computes everything. In that sense DP really doesn't look like "a subproblem which is easier to solve".

 Dynamic programming has a very precise meaning.
 He just said it wasn't the technical definition because he was keeping the language simple. The note is not directly related to the algorithm he then explains.

 Dynamic Programming can be seen as divide and conquer, but using memoization to avoid recalculations.

 The "subproblems easier to solve" are not easier complexity-wise, but easier in the sense that we are solving a smaller subinstance of the main problem.
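 The memoization-vs-tabulation distinction discussed above can be sketched with Fibonacci (a hypothetical example, not from the post): top-down memoization caches only the subproblems it actually reaches, while bottom-up DP fills a table for every subproblem from 0 up to n:

```python
from functools import lru_cache

# Top-down: recurse, and cache each result so it is computed only once.
@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up DP: fill a table for every subproblem from 0 up to n.
def fib_table(n):
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_memo(10), fib_table(10))  # 55 55
```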
 I'm glad you have my back, adcoelho :D. @martinced, as adcoelho stated, I gave that disclaimer because I wanted to make sure the reader knew it was a very rough definition that I hoped would be easier to understand, but not rigorous. If the reader wants the rigorous definition and proof of this concept, there are already many resources on the internet for that.
