From: Jim Apple (japple_at_[hidden])
Date: 2003-12-12 18:43:31


David Abrahams wrote:

>>>It might help some of us more ignorant folk if you'd post a definition
>>>of "memoization".
>>
>>and of dynamic programming too and how it differs from memoization :-)
>
> Especially if by "dynamic programming" you mean something other than an
> optimization technique for finding the least cost path through a
> search space. That's the only meaning I know for the term.

Sorry:

Here's a link to complement this stuff:

http://www.cs.dal.ca/~nzeh/Teaching/3110/DynProg.pdf

Dynamic programming/memoization is useful when a problem has a natural
recursive solution, but many of the sub-problems overlap. The classic
example of this is computing the Fibonacci numbers, which have the
following recursive definition:

f(0) = 1
f(1) = 1
f(n) = f(n-1) + f(n-2)

The naive implementation is miserably slow: its performance is
exponential. The smart way to calculate f(n) is to create a table,
building it up incrementally. The performance is now linear: f(n) is
only two lookups. f(m) must be calculated for all m < n before this
happens, but since each of those entries takes constant time, the
performance increase is dramatic. This is dynamic programming.
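
To make the contrast concrete, here is a minimal sketch of the
table-based version (an illustration only, not code from the earlier
email), using the f(0) = f(1) = 1 convention above:

    // Linear-time Fibonacci via an explicit table (dynamic programming).
    #include <vector>

    unsigned long fib(unsigned n)
    {
        std::vector<unsigned long> table(n < 2 ? 2 : n + 1);
        table[0] = 1;
        table[1] = 1;
        for (unsigned i = 2; i <= n; ++i)
            table[i] = table[i - 1] + table[i - 2]; // each entry is two lookups
        return table[n];
    }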

This concept is surprisingly useful. The "choose" function included in
the first email is another example where dynamic programming pays off
big. Other examples are:

* Optimal matrix multiplication order
* Longest common subsequence
* Finding shortest paths between all pairs of nodes in a graph
* Divided differences
* Parsing
* A version of the knapsack problem (not the NP-complete one!)
* Optimal binary search tree

The creation of dynamic programming algorithms often begins with a
recursive formulation of the problem. This is not always trivial, but
it is still simpler than the dynamic programming version, which has
explicit loops and table entries.

Memoizing is a simple concept: remember what you already know. This
creates the table implicitly. It can be faster or slower than dynamic
programming, but when it is slower it is often only by a constant
factor. The "choose" example is linear with memoization, but quadratic
with dynamic programming.
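
For contrast with the explicit table above, here is a sketch of
memoizing the same Fibonacci recursion (again, an illustration only):
the recursive definition stays as written, and a cache remembers what
has already been computed.

    // Memoized Fibonacci: same recursion as the definition above,
    // with a map that remembers previously computed answers.
    #include <map>

    unsigned long fib_memo(unsigned n)
    {
        static std::map<unsigned, unsigned long> cache; // the implicit "table"

        if (n < 2)
            return 1;

        std::map<unsigned, unsigned long>::iterator it = cache.find(n);
        if (it != cache.end())
            return it->second;                          // already known

        unsigned long result = fib_memo(n - 1) + fib_memo(n - 2);
        cache[n] = result;                              // remember it
        return result;
    }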

In addition, memoization can keep answers between (not just within) top
level/user space calls. This allows a certain flexibility in
programming: if I'm not sure how often I'll call some_calc(21,11,99)
again, I can switch between memoizing it or not with a few characters
of code. Cumbersome and bug-prone memoization hand-coded inside the
function is thus avoided. I can even switch memoization on/off at run
time.
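
A sketch of what I mean, with a hypothetical some_calc and a flag
standing in for the "few characters of code" (the details here are
illustrative only):

    // Memoization wrapped around a hypothetical some_calc: the cache
    // persists between top-level calls and can be switched off at run time.
    #include <map>
    #include <utility>

    long some_calc(int a, int b, int c)      // stand-in for the real work
    {
        return a * b + c;
    }

    bool memoize_some_calc = true;           // flip this to disable memoization

    long some_calc_cached(int a, int b, int c)
    {
        typedef std::pair<int, std::pair<int, int> > key_type;
        static std::map<key_type, long> cache;   // survives between calls

        if (!memoize_some_calc)
            return some_calc(a, b, c);           // memoization switched off

        key_type key(a, std::make_pair(b, c));
        std::map<key_type, long>::iterator it = cache.find(key);
        if (it == cache.end())
            it = cache.insert(std::make_pair(key, some_calc(a, b, c))).first;
        return it->second;
    }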

Jim

