From: Paul Mensonides (pmenso57_at_[hidden])
Date: 2006-07-13 04:39:21
> -----Original Message-----
> From: boost-bounces_at_[hidden]
> [mailto:boost-bounces_at_[hidden]] On Behalf Of Matt Calabrese
Everything above here is not suited to being part of the pp-lib. Not that there
is anything wrong with it (necessarily), just that it is a *use* of the
preprocessor library rather than a potential *part* of it.
> Finally, if none of that is of immediate interest, the
> underlying mechanism for all of this may be. What I do in
> order to get polymorphic behavior for different kinds of
> functions (normal functions, member functions, and
> templates) is that I use a form of object system. The concept is
> simple, but it is applicable to a large number of areas in
> preprocessor metaprogramming, and I offer it as a potential
> fundamental change to the way Boost.Preprocessor is
> interfaced with by simulating macro overloading, allowing for
> fewer named functions but with the same functionality and for
> simpler creation of generic preprocessor algorithms.
>
> The concept is simply this: An object is represented by the form
>
> ( OBJECT_ID, (object_data) )
>
> Where OBJECT_ID is a unique "type" for the object and
> object_data is the internal implementation of the container.
Chaos does something similar to this with container data types. E.g.
CHAOS_PP_SIZE( (CHAOS_PP_ARRAY) (3, (a, b, c)) ) // 3
CHAOS_PP_SIZE( (CHAOS_PP_LIST) (a, (b, (c, ...))) ) // 3
CHAOS_PP_SIZE( (CHAOS_PP_SEQ) (a)(b)(c) ) // 3
CHAOS_PP_SIZE( (CHAOS_PP_STRING) a b c ) // 3
CHAOS_PP_SIZE( (CHAOS_PP_TUPLE) (a, b, c) ) // 3
> The way it works is you "construct" an object by calling a
> macro which takes in the data and assembles it into the
> object representation just described. Then, internally, when
> calling "overloaded" macros, arguments are forwarded to a macro
Chaos also provides facilities to simulate overloaded macros, default arguments,
and optional arguments. E.g.
#define MACRO_1(a) -a
#define MACRO_2(a, b) a - b
#define MACRO_3(a, b, c) a - b - c
#define MACRO(...) \
    CHAOS_PP_QUICK_OVERLOAD(MACRO_, __VA_ARGS__)(__VA_ARGS__) \
    /**/
MACRO(1) // -1
MACRO(1, 2) // 1 - 2
MACRO(1, 2, 3) // 1 - 2 - 3
> which internally concatenates the function name to the
> OBJECT_ID and forwards the arguments once more along with the
> internal representation of the object. The result is a
> seemingly "overloaded" or "virtual" macro. As an example of
> how this could be extremely useful for Boost.Preprocessor is
> it allows for an easy way to represent consistent container
> concepts.
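For concreteness, a minimal sketch of that kind of dispatch might look something
like the following (the macro names are purely illustrative; they are not part
of Boost.Preprocessor or Chaos):
// "Objects" take the form ( OBJECT_ID, (object_data) ):
#define MAKE_ARRAY(size, tuple) (ARRAY, (size, tuple))
#define MAKE_TUPLE3(a, b, c) (TUPLE3, (a, b, c))
// Indirect concatenation so that arguments expand before pasting:
#define CAT(a, b) CAT_I(a, b)
#define CAT_I(a, b) a ## b
// "Overloaded" SIZE: paste the operation name onto the OBJECT_ID and
// forward the object data to the per-type implementation:
#define SIZE(obj) SIZE_D obj
#define SIZE_D(id, data) CAT(SIZE_, id) data
#define SIZE_ARRAY(size, tuple) size
#define SIZE_TUPLE3(a, b, c) 3
SIZE(MAKE_ARRAY(3, (a, b, c))) // 3
SIZE(MAKE_TUPLE3(x, y, z)) // 3
Note that adding a new data type under such a scheme means supplying a separate
implementation macro for every operation, which is where the scalability
concern below comes in.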
The problems with using something like this in the pp-lib are efficiency and
scalability. It takes time to process sequences of elements generically. There
can be significant overhead compared to just directly using an algorithm
designed to operate on a particular data type. A simple, direct dispatch
mechanism doesn't scale either, because it requires you to write many algorithms
each time you add a new data type. It also doesn't deal with algorithms that take
multiple sequences of elements as input:
CHAOS_PP_APPEND(
    (CHAOS_PP_SEQ) (a)(b)(c),
    (CHAOS_PP_LIST) (x, (y, (z, ...)))
)
// (CHAOS_PP_SEQ) (a)(b)(c)(x)(y)(z)
The way that Chaos does it is that it defines the algorithms themselves
generically, relying only on a small set of core primitives per data type (e.g.
HEAD, TAIL, IS_CONS). Then, when there is a significantly more efficient
(or interesting) way that a particular algorithm can be designed for a specific
data type, it provides that algorithm non-generically.
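As a rough illustration of that layering (again with made-up names and a
simplified (TAG, (data)) encoding rather than the encoding Chaos actually
uses), per-type primitives plus tag dispatch let a generic algorithm be
written once:
// Indirect concatenation:
#define CAT(a, b) CAT_I(a, b)
#define CAT_I(a, b) a ## b
// Core primitives for a TUPLE-like type (data is "a, b, ..."):
#define HEAD_TUPLE(x, ...) x
#define TAIL_TUPLE(x, ...) (TUPLE, (__VA_ARGS__))
// Core primitives for a LIST-like type (data is "a, (b, (c, NIL))"):
#define HEAD_LIST(x, rest) x
#define TAIL_LIST(x, rest) (LIST, rest)
// Generic primitives dispatch on the tag:
#define HEAD(obj) HEAD_D obj
#define HEAD_D(id, data) CAT(HEAD_, id) data
#define TAIL(obj) TAIL_D obj
#define TAIL_D(id, data) CAT(TAIL_, id) data
// A generic "algorithm" written purely in terms of the primitives
// (real algorithms also need IS_CONS and recursion machinery):
#define SECOND(obj) HEAD(TAIL(obj))
SECOND((TUPLE, (a, b, c))) // b
SECOND((LIST, (a, (b, (c, NIL))))) // b
Only the HEAD_*/TAIL_* (and, in practice, IS_CONS_*) primitives have to be
written per data type; everything built on top of them works for every type,
at the cost of the generic-dispatch overhead.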
The main problem is efficiency. The kinds of programming where generics would be
useful (relative to the complexity of the problem to be solved, not the
complexity of the solution) are exactly the kinds of programming where the
inefficiencies really add up.
Regards,
Paul Mensonides