Subject: Re: [proto] [phoenix] not playing nice with other libs
From: Eric Niebler (eric_at_[hidden])
Date: 2011-05-04 04:58:40
(cross-posting to the Proto list and cc'ing Hartmut.)
On 5/2/2011 6:18 PM, Thomas Heller wrote:
> On Mon, May 2, 2011 at 12:54 PM, Eric Niebler <eric.niebler_at_[hidden]> wrote:
>> The following trivial program fails to compile:
>> #include <boost/phoenix/core/limits.hpp>
>> #include <boost/numeric/conversion/converter.hpp>
>> It generates the following:
>> In file included from main.cpp:2:
>> ../../../../branches/release/boost/mpl/multiplies.hpp:38: error: wrong
>> number of template arguments (12, should be 5)
>> error: provided for 'template<class N1, class N2, class N3, class N4,
>> class N5> struct boost::mpl::multiplies'
>> Phoenix is changing the following fundamental constants:
>> BOOST_PROTO_MAX_ARITY
>> BOOST_MPL_LIMIT_METAFUNCTION_ARITY
>> BOOST_PROTO_MAX_LOGICAL_ARITY
>> BOOST_RESULT_OF_NUM_ARGS
>> IMO, Phoenix shouldn't be touching these. It should work as best it can
>> with the default values. Users who are so inclined can change them.
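For instance, an end-user who needs a larger arity can define the limit in their own translation unit before the first relevant Boost include (a sketch; the value 7 is illustrative):

```cpp
// User-side configuration: must appear before any Boost.Proto (or
// Phoenix) header is included, since the headers only set a default
// when the macro is not already defined. The value 7 is illustrative.
#define BOOST_PROTO_MAX_ARITY 7
#include <boost/proto/proto.hpp>
```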
> This problem is well known. As of now I have no clue how to fix it properly.
> Let me sketch why I changed these constants:
> 1) Phoenix V2 has a composite limit of 10:
> This is equivalent to the number of child expressions an expression can hold.
> This is controlled by BOOST_PROTO_MAX_ARITY for the number of
> template arguments for proto::expr and proto::basic_expr
> 2) Boost.Bind can take up to 10 parameters in the call to boost::bind
It's still not clear to me why you're changing
BOOST_MPL_LIMIT_METAFUNCTION_ARITY and BOOST_PROTO_MAX_LOGICAL_ARITY.
> The default BOOST_PROTO_MAX_ARITY is 5.
I see. So this is inherently a limitation in Proto. I set Proto's max
arity to 5 because more than that causes compile time issues. That's
because there are N*M proto::expr::operator() overloads, where N is
Proto's max arity and M is Proto's max function call arity. However:
- IIRC, Phoenix doesn't use proto::expr. It uses proto::basic_expr, a
lighter weight expression container that has no member operator overloads.
- Compile time could be improved by pre-preprocessing, like MPL. That's
something I've been meaning to do anyway.
- The max function-call arity can already be set separately from the max
number of child expressions.
- The compile-time problem is a temporary one. Once more compilers have
support for variadic templates, all the operator() overloads can be
replaced with just one variadic one. Which should be done anyway.
The solution then is in some combination of (a) allowing basic_expr to
have a greater number of child expressions than expr, (b) bumping the
max arity while leaving the max function call arity alone, (c)
pre-preprocessing, (d) adding a variadic operator() for compilers that
support it, and (e) just living with worse compile times until compilers
catch up with C++0x.
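Option (b), for instance, can be expressed with Proto's existing configuration macros, since the function-call arity is already settable separately (values are illustrative; the macros must be defined before the first Proto include):

```cpp
// Sketch of option (b): more child expressions per node, while keeping
// the function-call arity (and thus the operator() overload count) low.
#define BOOST_PROTO_MAX_ARITY 10
#define BOOST_PROTO_MAX_FUNCTION_CALL_ARITY 5
#include <boost/proto/proto.hpp>
```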
Not sure where the sweet spot is, but I'm pretty sure there is some way
we can get Proto to support 10 child expressions for Phoenix's usage
scenario. It'll take some work on my end though. Help would be appreciated.
> The BOOST_RESULT_OF_NUM_ARGS constant needed to be changed because I
> needed to provide 11 arguments in a "call" to boost::result_of. But I
> guess a workaround can be found in this specific case.
What workaround did you have in mind?
> I wonder what qualifies as "User". Phoenix is certainly a user of mpl,
> result_of and proto. Spirit is a user of proto and phoenix. Spirit needs an arity of 7 (IIRC).
By "user" I meant "end-user" ... a user of Boost. You have to consider
that someone may want to use Phoenix and MPL and Numeric and ... all in
the same translation unit. We shouldn't make that hard. This
proliferation of interdependent constants is a maintenance nightmare.
I tend to agree with Jeff Hellrung, who said that Phoenix should make do
with the defaults and document any backward incompatibilities and how to
fix them. But we should make every effort such that the defaults Just Work.
> Anybody got any ideas?
> One idea that comes to my mind is having a phoenix::proto_expr,
> which is basically a proto::basic_expr. Not sure if that would work, though.
I don't like special-casing for Phoenix. Other libraries have the same
problem.
Hartmut, you have done some work on a Wave-based tool to help with the
pre-preprocessing grunt-work, is that right?
--
Eric Niebler
BoostPro Computing
http://www.boostpro.com
Proto list run by eric at boostpro.com