Subject: Re: [boost] Alternative implementation for BOOST_PP_VARIADIC_SIZE
From: Paul Mensonides (pmenso57_at_[hidden])
Date: 2011-11-14 22:58:24
On Mon, 14 Nov 2011 19:13:17 +0000, Jens Gustedt wrote:
> Paul Mensonides <pmenso57 <at> comcast.net> writes:
>> You cannot detect emptiness in the general case (even excluding
>> pathological input such as unbalanced parentheses). This is simply a
> could you please give us some arguments for such a strong statement?
> P99_IS_EMPTY does at least pretty well, and contrary to Edward Diener's
> false assertion somewhere in this thread, it does well with function-like
> macros.
(Excluding the pathological case of an unbalanced parenthesis.)
You can make it work with input terminating in a function-like macro at
the cost of making it not work for other things. As with any sort of
predicate-like detection, you have to somehow interact with the token
sequence of the argument in a way that either yields 1 or 0 (or whatever
the equivalent of true and false are). The problem ultimately comes from
the fact that there is very little one can do to interact with the argument
in a way that is always legal and that yields a detectable result (i.e. a
result that differs between empty and nonempty input).
One can use token-pasting, but that rules out a *vast* amount of input
because token-pasting must yield a valid single token (otherwise it is an
error). Many compilers allow it as a non-standard extension or by
oversight, but this is about standard C/C++. If that behavior is bad
(which I personally believe it is), change the standards.
The other way is by intentionally putting a (variadic) function-like macro
name in front of the argument and a () after it. The leading macro may
expand against the argument, and the argument may expand against the
trailing () that you added. However, if the argument ends in a
function-like macro name, you have no idea what that macro is, what its
arity is, what it expands to, or what will happen when it expands against
that trailing ().
The bottom line is that there is no general way to interact with the
argument that doesn't cause compiler errors with some input. Even worse,
the token-pasting approach rules out a *vast* amount of input. There, at
least, with a good preprocessor, you get either a genuine detection
or a compiler error. With the function-like macro approach, there's no
telling what will happen. It depends heavily on whatever that function-
like macro expands to. It may cause an error, it may not. When it
doesn't, it may yield an incorrect answer for the IS_EMPTY predicate.
>> Regardless, the scenario here is fundamentally wrong-headed. A macro
>> argument may be empty, but that does not change the number of
>> arguments. E.g.
>> #define A(...)
>> A() // one argument (*not* zero arguments)
>> B(,) // two arguments
>> C(,,) // three arguments
> I completely agree with that part. In P99 I use that in particular to
> provide default arguments for functions, and there it is important to
> have A() detect that this is an empty argument and to produce the
> default.
Such detection is fine provided the input is suitably restricted. The
problem is that that is highly domain-specific rather than general-
purpose. One could have two separate IS_EMPTY-type macros that have
different domain restrictions. For example, Chaos has an
IS_EMPTY_NON_FUNCTION for the latter case. I don't remember off hand
whether I made the other, though it is easy enough. Regardless, neither
of these could be used in any sort of data structure implementation. How
to interpret emptiness, to the degree that it is detectable, is domain-
specific and under the purview of the user not the general purpose library.
> (Default arguments with macros is probably not an issue for boost, since
> this is for C++ which has default arguments in the core language. P99 is
> for C99 which hasn't)
>> A( ) // one argument containing a space
>> The only way to have a function-like macro that takes zero arguments is to
>> define the macro as nullary.
> I didn't capture what you try to say, here.
#define A() // nullary macro definition
#define B(x) // unary macro definition
A() // invocation with 0 arguments
B() // invocation with 1 argument
A( ) // invocation with 0 arguments thanks
// to special rules for nullary macros
B( ) // invocation with 1 argument which contains
// a single space due to the whitespace
// compression from an earlier phase of
// translation
#define E // defined as nothing
#define C(x) B(x)
A(E) // error: A takes zero arguments but receives one (E)
C( E E ) // B ends up getting invoked with 3 spaces
// as the compression doesn't happen this
// late in the process
In this latter case, there is no semantic effect that can be determined
from the whitespace. The only way whitespace can be significant from this
point on is during stringizing, and there only when it is inside of an
argument being stringized (which is specifically compressed) versus not
inside the argument. E.g.
#define S1(x) S2(x)
#define S2(x) #x
#define N() E E
S1(+N()+) // "+ +"
Moreover, whitespace can matter even inside an ordinary argument list:
the space before the second argument is part of that argument. So
#define P(x, y) S1(+y)
P(1,2) // "+"
P(1, 2) // "+ 2"
I don't know of a single preprocessor (aside from maybe Hartmut's) that
always deals with whitespace correctly. However, even if such
preprocessors were common, you'd have to design every part of the library
extremely carefully to get predictable behavior in this type of corner
case.
Boost list run by bdawes at acm.org, david.abrahams at rcn.com, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk