Subject: Re: [boost] Review Request: Variadic Macro Data library
From: Paul Mensonides (pmenso57_at_[hidden])
Date: 2011-02-21 20:20:28
On Mon, 21 Feb 2011 18:29:46 -0500, Edward Diener wrote:
> On 2/21/2011 2:37 PM, Paul Mensonides wrote:
>> Most C/C++ developers perceive macro expansion mechanics to be similar
>> to function call mechanics. I.e. where a user "calls" a macro A, and
>> that macro "calls" the macro B, the macro B "returns" something, which
>> is, in turn "returned" by A. That is fundamentally *not* how macro
>> expansion behaves. The perceived similarity, where there is none
>> (going all the way back to well before preprocessor metaprogramming), is
>> how developers have gotten into so much trouble on account of macros.
>
> OTOH users of macros are not concerned, as developers should be, with how
> the macro expands. They are just given a macro syntax to use which the
> developer supposes should feel natural to them.
Natural for the domain, not natural because it matches the underlying
language.
>> I take serious issue with anything that intentionally perpetuates this
>> mentality. It is one thing if the syntax required is the same by
>> coincidence. It's another thing altogether when something is done to
>> intentionally make it so.
>
> I really feel you are stretching your case for why you do not like
> #define SOME_MACRO(...) as opposed to #define SOME_MACRO((...)). I do
> understand your feeling that variadics can be more easily misused than
> pp-lib data types. But to me that is a programmer problem and not your
> problem.
1) I don't like the lack of revision control available with the MACRO
(...) form. That form simply doesn't scale.
2) I don't like duplicating interfaces to remove a pair of parentheses--
particularly to make it look like a C++ function call.
3) I don't like libraries that choose to increase compile time for no
functional gain and only a perceived syntactic gain (and a minor one at
that).
My dislike really has very little to do with misusing variadic content.
>> #define REM(...) __VA_ARGS__
>>
>> #define A(im) B(im) // im stands for "intermediate"
>> // (chaos-pp nomenclature)
>> #define B(x, y) x + y
>>
>> A(REM(1, 2)) // should work, most likely won't on many preprocessors
>
> I understand your concerns. But I don't think you can do anything about
> how programmers use things. You provide functionality because it has its
> uses. If some of the uses lead to potential problems because of
> programmer misunderstanding or compiler weakness, you warn the
> programmer. That's the best you can do without removing decent
> functionality just because of programmer misuse or compiler fallibility.
> Of course good docs about pitfalls always help.
Unfortunately, that's not the way it works. When a library doesn't work
on a compiler, that ends up being the library's problem, not the
toolchain vendor's problem. Look at VC++, for example: after all these
years the pp-lib *still* needs all of the blatant hacks put in place to
support it, and MS still basically says #@!$-off. And (unfortunately) it
*has* to be supported.
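For reference, here is how a conforming preprocessor is supposed to
handle the A(REM(1, 2)) example above (a sketch of the standard
argument-expansion and rescanning steps, nothing more):

  A(REM(1, 2))
    -> the argument 'im' is macro-expanded first: REM(1, 2) -> 1, 2
    -> substitution into A's replacement list yields B(1, 2)
    -> rescanning invokes B with x = 1 and y = 2, giving 1 + 2

A broken preprocessor typically hands the expanded "1, 2" to B as a
single argument instead, so B comes up one argument short and the
expansion fails somewhere downstream.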
>>>> BOOST_VMD_DATA_TO_PP_TUPLE(...)
>>>> -> (nothing, unless workarounds are necessary)
>>>
>>> I know its trivial but I still think it should exist.
>>
>> It is quite possible that workarounds need to be applied anyway to
>> (e.g.) force VC++ to "let go" of the variadic arguments as a single
>> entity.
>
> I will look further into this issue. I did have a couple of VC++
> problems that needed workarounds, which I was able to find thanks to
> your own previous cleverness dealing with VC++.
The main issue is that a problem may not surface until somewhere "far
away". E.g. it may get passed around through tons of other stuff before
causing a failure.
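The underlying "single entity" behavior, and the usual shape of the
workaround, look something like this (a sketch only; the macro names
are mine, not VMD's or the pp-lib's):

  #define EXPAND(x) x  /* extra scan that splits __VA_ARGS__ back up */

  #define COUNT(...) EXPAND(COUNT_I(__VA_ARGS__, 3, 2, 1,))
  #define COUNT_I(a, b, c, n, ...) n

  COUNT(x, y) /* 2 on a conforming preprocessor; without the EXPAND
                 wrapper, VC++ tends to pass the expanded __VA_ARGS__
                 along as one argument and the count comes out as 1 */

And that wrongly counted result is exactly the kind of thing that can
travel through a dozen more macros before anything visibly breaks.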
>> If the use case is something like what you mentioned before:
>>
>> #define MOC(...) /* yes, that's you, Qt */ \
>> GENERATE_MOC_DATA(TUPLE_TO_SEQ((__VA_ARGS__))) \
>> /**/
>>
>> Then why does the TUPLE_TO_SEQ((__VA_ARGS__)) part matter to the
>> developer who invokes MOC?
>
> Because I am not converting a tuple to a seq but a variadic sequence to
> a seq, and I feel the syntax should support that idea.
Right, but in this context, you're the intermediate library developer.
You are providing some domain-specific functionality (e.g. generating Qt
bloat), but you're doing preprocessor metaprogramming to do it. So, in
this case, you're giving your users the "pretty" syntax that you think is
important and encapsulating the significant metaprogramming behind that
interface.
When you're doing significant metaprogramming, what is syntax? Why is
there an expectation that a DSEL like a metaprogramming library have any
particular syntax?
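Concretely, sticking with the placeholder names from the MOC sketch
above: the user writes

  MOC(public, int, value)

and never needs to know that, behind the interface, those arguments are
immediately turned into a seq along the lines of (public)(int)(value),
which is what the actual metaprogramming then operates on.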
>> I'm a lot more opposed to going back from a proper data structure to an
>> "argument list".
>
> Then I will go back to an "element list" <g>. If the end-user uses it as
> an "argument list" you can sue me but not for too much because I am
> poor. <g><g>
What ultimately happens is that users try to do something, and they run
into problems (often due to compilers), but they don't have the know-how
to work around the problems or do something in a different way. So, they
go to the library author for help. Providing that help can be a lot of
work, require a lot of explanation, implementation help, etc. Like most
other Boost developers, I've had to do that countless times. Failing to
do that, however, fails to achieve (what I consider to be) one of the
fundamental purposes of publishing the library as open-source to begin
with. This is particularly true in the Boost context, which is largely
about advancing idioms and techniques and "improving the world". I.e.,
though not exclusively, there is a lot of altruism involved.
Regards,
Paul Mensonides