From: Beman Dawes (bdawes_at_[hidden])
Date: 2006-03-05 21:21:34


"John Maddock" <john_at_[hidden]> wrote in message
news:006a01c6403f$b0760640$caed1b52_at_fuji...
>>> That's an implementation detail. It isn't required by the spec,
>>> although that may be the most obvious way to implement the spec. An
>>> alternate implementation would be to keep a pool of directory entry
>>> objects and recycle them if performance was a concern. It would be
>>> great if Boost had a cache library to make such a strategy trivial
>>> to implement.
>
> What kind of cache did you have in mind? Regex has an "Object Cache"
> (see boost/regex/pending/object_cache.hpp) that I always meant to
> submit for full Boost status but never got around to :-(
>
> It simply maps a "key" to an instance of an object constructed from
> the key: if the object already exists it returns it, otherwise it
> constructs a new one and caches it. Objects are returned as a
> shared_ptr<const Object> (the shared_ptr is needed for memory
> management, since neither the cache nor the client has exclusive
> ownership of the object). Usage is simply:
>
> shared_ptr<const Object> obj =
>     object_cache<Key, Object>::get(
>         my_key,
>         max_number_of_objects_to_cache);
>
> To be honest its usefulness is pretty limited, and threading issues
> add a small performance hurdle, which means that "Objects" have to be
> fairly expensive to construct before caching them becomes worthwhile.
>
> Still, there you go ;-)
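
For anyone who hasn't looked at that header, a minimal sketch of this kind of
key-to-object cache might look roughly like the following. Everything here is
illustrative: the class and function names are made up, the size cap
(max_number_of_objects_to_cache) and the locking a thread-safe version needs
are omitted, std::shared_ptr stands in for boost::shared_ptr, and the real
object_cache internals may well differ.

#include <map>
#include <memory>

// Illustrative sketch only -- not the actual object_cache interface.
// It memoizes "construct an Object from a Key" and hands out shared
// ownership; a real cache would also cap its size and lock around the map.
template <class Key, class Object>
class simple_object_cache
{
public:
    static std::shared_ptr<const Object> get(const Key& key)
    {
        std::map<Key, std::shared_ptr<const Object> >& index = instance();
        typename std::map<Key, std::shared_ptr<const Object> >::iterator
            it = index.find(key);
        if (it != index.end())
            return it->second;                        // cache hit: reuse it
        std::shared_ptr<const Object> obj =
            std::make_shared<const Object>(key);      // construct from the key
        index.insert(std::make_pair(key, obj));       // cache miss: remember it
        return obj;
    }

private:
    static std::map<Key, std::shared_ptr<const Object> >& instance()
    {
        static std::map<Key, std::shared_ptr<const Object> > m;
        return m;
    }
};

// Usage would mirror the snippet above, minus the size argument:
//   std::shared_ptr<const MyObject> obj =
//       simple_object_cache<MyKey, MyObject>::get(my_key);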

Rather than a cache that does A or a cache that does B, I'd prefer to see
someone review the various cache features in the literature (or at least
whatever Google can find), identify the most useful ones, and then design one
or more classes (probably template based) that deliver that feature set.

For example, "Least Recently Used", "Least Frequently Used", and "Least
Recently Added" are common replacement policies, but there are probably some
others that are occasionally useful. I've used a scheme where entries are
never flushed as long as there are any live references to them.
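
To make the LRU case concrete, one common shape is a map plus a
recency-ordered list; the other policies differ mainly in which entry gets
evicted. The sketch below uses made-up names, has no thread safety, and isn't
a proposed interface, just an illustration of the technique.

#include <cstddef>
#include <list>
#include <map>
#include <utility>

// Illustrative LRU sketch, not a proposed Boost interface.  A list keeps
// entries ordered from most- to least-recently used; a map points each key
// at its position in that list.  LFU, "least recently added", or a
// "never flush while live references exist" policy would mostly change
// which entry the eviction step picks.
template <class Key, class Value>
class lru_cache
{
public:
    explicit lru_cache(std::size_t capacity) : capacity_(capacity) {}

    // Returns a pointer to the cached value, or 0 on a miss; a hit marks
    // the entry as most recently used.
    const Value* find(const Key& key)
    {
        typename index_type::iterator it = index_.find(key);
        if (it == index_.end())
            return 0;
        order_.splice(order_.begin(), order_, it->second);  // refresh recency
        return &it->second->second;
    }

    void insert(const Key& key, const Value& value)
    {
        if (find(key) != 0)
            return;                              // already cached
        if (!order_.empty() && order_.size() >= capacity_)
        {
            index_.erase(order_.back().first);   // evict least recently used
            order_.pop_back();
        }
        order_.push_front(std::make_pair(key, value));
        index_[key] = order_.begin();
    }

private:
    typedef std::list<std::pair<Key, Value> > order_type;
    typedef std::map<Key, typename order_type::iterator> index_type;

    std::size_t capacity_;
    order_type  order_;   // most recently used at the front
    index_type  index_;   // key -> position in order_
};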

Another way to approach a cache library design would be to collect a set of
use cases, and then see what feature set would be needed to satisfy those
use cases.

Caching is one of the fundamental ways to improve performance. I'm a bit
surprised it hasn't been more of a topic in Boost discussions, or in computer
science in general.

That isn't a criticism of your object_cache; it is just that I think the
topic deserves more than a single-solution approach.

--Beman

