
Subject: Re: [boost] [utility] new auto_buffer class --- RFC
From: Matt Calabrese (rivorus_at_[hidden])
Date: 2009-03-02 11:11:37

On Mon, Mar 2, 2009 at 10:29 AM, Thorsten Ottosen <
thorsten.ottosen_at_[hidden]> wrote:
> I'm more scared by a #define that at the 256 default.

Hmm? Why is that? On the contrary, I would say a #define here is simply good
practice. A raw 256 only provides a default at a level where you have
little-to-no knowledge of what such a default should be. Allowing the
default to be overridden via a #define lets users easily adjust that
value to their own requirements, and do so without having to directly
modify any Boost code (often just via command-line arguments to the
compiler or project settings in an IDE) and without having to use a
metafunction/template alias/wrapper to make a new default for their
domain. I see all of this as far preferable to a strict default value
of 256.

I'm all about avoiding the preprocessor when there are better alternatives,
but here it seems to be the ideal solution and offers the most flexibility
at no cost.

> Different users (or different library implementers)
> can make their own decision regarding this based on domain specific
> knowledge.
> For example, Boost.Format might conjecture that 99% of all strings are
> less than 350 chars, and so use
> OTOH, in my custom project where we work with NP-Hard graph problems,
> I know that N=64
> One might speculate if the default should be the number of elements of the
> stack buffer, or the total size of the stack buffer. If somebody
> wrote

This is precisely my point. At what point does it make sense for 256 to be
the default? IMHO, if you have little basis for the value other than that
you think some default should be there and 256 is a nice round power of 2,
it should at least be easy for users of the library to customize; otherwise
I question the usefulness of such a default.

-Matt Calabrese
