
Subject: Re: [proto] [phoenix] not playing nice with other libs
From: Eric Niebler (eric_at_[hidden])
Date: 2011-05-04 13:32:42

On 5/4/2011 6:25 PM, Thomas Heller wrote:
> On Wed, May 4, 2011 at 10:58 AM, Eric Niebler <eric_at_[hidden]> wrote:
>> On 5/2/2011 6:18 PM, Thomas Heller wrote:
>>> The default BOOST_PROTO_MAX_ARITY is 5.
>> I see. So this is inherently a limitation in Proto. I set Proto's max
>> arity to 5 because more than that causes compile time issues. That's
>> because there are N*M proto::expr::operator() overloads, where N is
>> Proto's max arity and M is Proto's max function call arity. However:
>> - IIRC, Phoenix doesn't use proto::expr. It uses proto::basic_expr, a
>> lighter weight expression container that has no member operator overloads.
> Correct. But we also need the arity in:
> proto::call, proto::or_ and maybe some others

I'd like more details here, please. You never really *need* to increase
BOOST_PROTO_MAX_LOGICAL_ARITY because you can nest multiple proto::or_'s
and proto::and_'s. And if you need that many, you might think about
refactoring your grammar. Proto::or_ can be more efficiently rewritten
as proto::switch_, for instance.
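To illustrate the nesting point without pulling in Boost headers, here is a minimal stand-in (not Proto's actual code) for a fixed-arity or_ whose match check recurses into nested alternatives; five alternatives fit in a max-arity-3 template, which is why BOOST_PROTO_MAX_LOGICAL_ARITY never truly needs raising:

```cpp
#include <type_traits>

// Minimal stand-in for proto::or_, capped at 3 alternatives, to show
// how nesting sidesteps the arity limit. Names are illustrative only.
struct none {};
template <class A, class B, class C = none> struct or_ {};

template <class Alt, class T> struct matches : std::false_type {};

template <class A, class B, class C, class T>
struct matches<or_<A, B, C>, T>
  : std::integral_constant<bool,
        std::is_same<A, T>::value
     || std::is_same<B, T>::value
     || std::is_same<C, T>::value
     || matches<A, T>::value        // recurse into nested or_'s
     || matches<B, T>::value
     || matches<C, T>::value> {};

// Five alternatives expressed with max-arity-3 or_'s by nesting:
typedef or_<int, long, or_<float, double, char> > G;

static_assert(matches<G, double>::value, "found in the nested or_");
static_assert(!matches<G, void*>::value, "not an alternative");
```

The same trick applies verbatim to real proto::or_ and proto::and_: an alternative slot can itself be another or_.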


>> The solution then is in some combination of (a) allowing basic_expr to
>> have a greater number of child expressions than expr, (b) bumping the
>> max arity while leaving the max function call arity alone, (c)
>> pre-preprocessing, (d) adding a variadic operator() for compilers that
>> support it, and (e) just living with worse compile times until compilers
>> catch up with C++0x.
>> Not sure where the sweet spot is, but I'm pretty sure there is some way
>> we can get Proto to support 10 child expressions for Phoenix's usage
>> scenario. It'll take some work on my end though. Help would be appreciated.
> Yes, I was thinking of possible solutions:
> 1) splittling the expressions in half, something like this:
> proto::basic_expr<
> tag
> , proto::basic_expr<
> sub_tag
> , Child0, ..., Child(BOOST_PROTO_MAX_ARITY)
> >
> , proto::basic_expr<
> sub_tag
> >
> >
> This would only need some additional work on the phoenix side.
> Not sure if its actually worth it ... or even working.

Not this. It's like that early prototype of Phoenix where every
expression was a terminal and the value was a Fusion sequence of other
Proto expressions. You can't use Proto's transforms to manipulate such
expressions.

Admittedly, Proto is rather inflexible when it comes to how children are
stored. My excuse is that I do it to bring down compile times.

> 2) Have some kind of completely variadic proto expression. Not by having
> variadic templates but by creating the list of children by some kind of
> cons list. This might require a quite substantial change in proto;
> haven't fully investigated that option.

You would go from instantiating 1 template per node to instantiating N
templates, where N is the number of child nodes. This is then multiplied
by the number of nodes in an expression tree. Not good.
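The instantiation-count argument can be made concrete with a Boost-free sketch (this is not Proto code; the type names are illustrative): a cons-list node per child versus one flat node template for all children.

```cpp
// Cons-list storage: one 'cons' instantiation per child, so a node with
// N children drags in N template instantiations.
struct nil {};
template <class Head, class Tail> struct cons {
    Head head;
    Tail tail;
};
typedef cons<int, cons<float, cons<char, nil> > > three_children;

// Flat storage: exactly one instantiation regardless of child count.
// This is the shape Proto's expr/basic_expr templates take.
template <class C0, class C1, class C2> struct expr3 {
    C0 child0;
    C1 child1;
    C2 child2;
};
typedef expr3<int, float, char> flat_node;
```

Multiplied across every node of every expression tree in a translation unit, that factor of N is exactly the compile-time cost Eric is objecting to.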

>>> The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I
>>> needed to provide 11 arguments in a "call" to boost::result_of. But I
>>> guess a workaround can be found in this specific case.
>> What workaround did you have in mind?
> Calling F::template result<...> directly, basically reimplementing
> result_of with phoenix's own limits.

As an implementation detail? Sure, no problem.
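For readers unfamiliar with the workaround being proposed: a functor following the TR1 result_of protocol carries a nested result<> template, and consulting it directly bypasses boost::result_of and therefore its BOOST_RESULT_OF_NUM_ARGS cap. A minimal sketch (the functor is hypothetical):

```cpp
#include <type_traits>

// A functor implementing the TR1 result_of protocol by hand.
struct plus_f {
    template <class Sig> struct result;

    template <class This, class A, class B>
    struct result<This(A, B)> { typedef A type; };

    template <class A, class B>
    A operator()(A a, B b) const { return a + b; }
};

// The workaround: use the nested template directly, with no
// boost::result_of (and no arity limit) in between.
typedef plus_f::result<plus_f(int, int)>::type r;
static_assert(std::is_same<r, int>::value, "result computed directly");
```

Phoenix can apply this internally for any argument count it chooses, which is why it works as an implementation detail.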

>>> I wonder what qualifies as "User". Phoenix is certainly a user of mpl,
>>> result_of and proto. Spirit is a user of proto and phoenix. Spirit needs an arity of 7 (IIRC).
>> By "user" I meant "end-user" ... a user of Boost. You have to consider
>> that someone may want to use Phoenix and MPL and Numeric and ... all in
>> the same translation unit. We shouldn't make that hard. This
>> proliferation of interdependent constants is a maintenance nightmare.
> I agree. I don't think there really is a general solution to that. There
> have been reports by Michael Caisse of some macro definition nightmare
> while using MSM together with Spirit. If I remember the details
> correctly, MSM changes the Proto constants as well. This problem is not
> really Phoenix-specific!

Oh, yeah. MSM changes Proto's max arity to be 7. OK, I can see that 5 is
too low for folks. Proto needs some work.

>> I tend to agree with Jeff Hellrung who said that Phoenix should make do
>> with the defaults and document any backwards incompatibilities and how
>> to fix them. But we should make every effort such that the defaults Just
>> Work.
> I agree. One simple solution is to add a big fat warning to the docs
> saying to include phoenix/core/limits.hpp before anything else. However,
> that will not solve your reported problem.

Right, that's not what I had in mind. The warning would be like, "By
default, phoenix::bind only accepts up to X arguments, whereas
boost::bind accepts 10. If you want phoenix::bind to accept 10 also, you
have to do XYZ." Ditto for any incompatibilities with Phoenix v2.
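By way of illustration only, the "XYZ" step would presumably look like defining the limit macros discussed in this thread before the first Phoenix or Proto header is seen, consistently in every translation unit (which header to include, and which macros suffice, is exactly what the docs would have to pin down):

```cpp
// Hypothetical sketch of the documented fix, not a tested recipe:
// raise the limits before any Phoenix/Proto header is included.
#define PHOENIX_LIMIT 10
#define BOOST_PROTO_MAX_ARITY 10
#define BOOST_RESULT_OF_NUM_ARGS 11
#include <boost/phoenix.hpp>   // or whichever Phoenix headers the TU uses
```

The fragility Eric and Thomas are circling is visible here: get the include order wrong in one translation unit and the constants silently disagree across the program.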

>>> Anybody got any ideas?
>>> One idea that comes to my mind is having a phoenix::proto_expr,
>>> which is a proto::basic_expr, basically. Not sure if that would work though
>> I don't like special-casing for Phoenix. Other libraries have the same
>> problem.
>> Hartmut, you have done some work on a Wave-based tool to help with the
>> pre-preprocessing grunt-work, is that right?
> Yes. Hartmut implemented partial preprocessing for Phoenix using Wave.
> As an example of how to use it, see this file:
> To preprocess Phoenix, call:
> wave -o- -DPHOENIX_LIMIT=10 libs/phoenix/preprocess/preprocess_phoenix.cpp
> You need to have a file called wave.cfg in your current directory.
> An example configuration can be found at:

OK, I'll dig into this, hopefully this weekend. FYI, I've committed a
change that makes proto::expr::operator() use variadics when they're
available.
With any luck, pre-preprocessing will get us to the point where I can
just bump Proto's max arity to 10 and not suffer any degradation in
compile times. I'll also need to investigate why Proto depends on

Eric Niebler
BoostPro Computing

Proto list run by eric at