
Boost : 
From: David Abrahams (dave_at_[hidden])
Date: 2003-12-12 20:29:33
Jim Apple <japple_at_[hidden]> writes:
> David Abrahams wrote:
>
>>>>It might help some of us more ignorant folk if you'd post a definition
>>>>of "memoization".
>>>
>>>and of dynamic programming too and how it differs from memoization :)
>> Especially if by "dynamic programming" you mean something other than
>> an optimization technique for finding the least cost path through a
>> search space. That's the only meaning I know for the term.
>
> Sorry:
>
> Here's a link to complement this stuff:
>
> http://www.cs.dal.ca/~nzeh/Teaching/3110/DynProg.pdf
Still no definition of memoization, and the distinction from dynamic
programming given the examples is non-obvious. The nearest thing I can
find is that it's top-down dynamic programming, but I guess I never
thought of dynamic programming as being related to a particular
search order.
>
> Dynamic programming/memoization is useful when solving a problem with
> a recursive solution, but where many subproblems overlap.
Don't all problems with an iterative solution have a recursive
solution? I never think of dynamic programming that way, though it
appears we're talking about the same thing.
> The
> classic example of this is computing the Fibonacci numbers, which have
> the following recursive definition:
>
> f(0) = 1
> f(1) = 1
> f(n) = f(n-1) + f(n-2)
>
> The naive implementation is miserably slow: its performance is
> exponential.
You'd have to be *pretty* naive to write it that way.
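(For the record, the naive version in question would be the direct transcription of the recurrence, something like this sketch:)

```cpp
#include <cassert>

// Direct transcription of the recurrence: each call spawns two
// more, so the same subproblems are recomputed exponentially often.
unsigned naive_fib(unsigned n)
{
    if (n < 2) return 1;
    return naive_fib(n - 1) + naive_fib(n - 2);
}
```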
> The smart way to calculate f(n) is to create a table, building
> incrementally.
I guess, if you need to calculate f(n) for many different
nonconsecutive n, that's smart. If you only need to do it once you
just:
unsigned fib(unsigned n) // linear, no table.
{
    if (n < 2) return 1;
    unsigned n1 = 1, n2 = 1;
    for (unsigned i = 2; i < n; ++i)
    {
        unsigned x = n1 + n2;
        n2 = n1;
        n1 = x;
    }
    return n1 + n2;
}
> The performance is now linear: f(n) is only
> two lookups.
That's constant, not linear.
> f(m) must be calculated for all m < n before this
> happens, but since each of these is constant time, the performance
> increase is dramatic. This is dynamic programming.
Well, not a perfectly worded example, but I now know we're talking
about the same dynamic programming.
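For concreteness, the table-building (bottom-up) version being described might be sketched as:

```cpp
#include <vector>

// Bottom-up dynamic programming: fill a table of f(0)..f(n) in
// increasing order, so each entry is computed exactly once from
// the two entries before it.
unsigned table_fib(unsigned n)
{
    std::vector<unsigned> table(n + 1, 1); // f(0) = f(1) = 1
    for (unsigned i = 2; i <= n; ++i)
        table[i] = table[i - 1] + table[i - 2];
    return table[n];
}
```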
> Memoizing is a simple concept: remember what you know. This creates the
> table implicitly.
Example? I always think of dynamic programming as creating the cache
implicitly, too.
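My best guess at what's meant, as a sketch: keep the recursive form, but cache each result the first time it's computed, so the table grows implicitly.

```cpp
#include <map>

// Top-down memoization: the recursion is unchanged, but results
// are cached on first computation, so each subproblem is solved
// only once and the table of known values grows implicitly.
unsigned memo_fib(unsigned n)
{
    static std::map<unsigned, unsigned> cache;
    if (n < 2) return 1;
    std::map<unsigned, unsigned>::iterator it = cache.find(n);
    if (it != cache.end()) return it->second;
    unsigned result = memo_fib(n - 1) + memo_fib(n - 2);
    cache[n] = result;
    return result;
}
```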
Dave Abrahams
Boost Consulting
www.boostconsulting.com
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk