Boost :

From: Mat Marcus (mat-boost_at_[hidden])
Date: 2003-11-15 14:20:08

--On Saturday, November 15, 2003 5:57 AM -0500 Brian McNamara
<lorgon_at_[hidden]> wrote:

> On Fri, Nov 14, 2003 at 03:16:50PM -0800, Mat Marcus wrote:
>> --On Friday, November 14, 2003 2:29 PM -0500 Brian McNamara
>> > I think what you are asking is "whose responsibility is it to
>> > write lines (3) and (4)--the author of Clonable or the author of
>> > Foo"?
>> This is more or less what I was asking. One possible difference is
>> that when I think concept I think of a *multi-sorted* specification
>> (that a collection of types and operations might model). It is
>> useful to discuss your simple (e.g. single-sorted) examples.
> So, does "multi-sorted" just mean "concepts involving more than one
> type" (like your DragDropable example below)?


>> > - "Clonable comes before Foo"
>> >
>> > Imagine that Clonable is a type class in the "standard library".
>> > Today you write Foo. If Foo is a Clonable, then you (the author
>> > of Foo) also write the instance declaration.
>> How would this work for DragDropable when Foo and, say, Bar and a
>> pair of operations are together DragDropable, but the interfaces
>> for Foo, Bar, etc. are loosely coupled and come packaged in
>> different headers?
> If I understand your question, it would simply be
>
>   import FooModule
>   import BarModule
>   import DragDropModule
>
>   instance DragDropable Foo Bar where
>     -- get_icon :: Foo -> Bar -> Icon
>     get_icon f b = {- ... -}
>     -- drop :: Foo -> Bar -> Void
>     drop f b = {- ... -}
>
> but I think I am missing what you are asking.

You didn't miss what I was asking. I was just trying to bring out the
point that if you do this often, things begin to look ugly, as you
mention in your "Looks ugly? Tough :-)" comment below.

>> > - "Foo comes before Clonable"
>> >
>> > Take the specific case where Foo is "Int", for example. The
>> > author of Clonable, who "invents" the type class some time after
>> > a number of instances have already been invented, should do his
>> > best to write instance declarations for existing types (like Int)
>> > which could easily conform. (As more time passes, the "gaps"
>> > will get filled in.)
>> I worry whether this could lead to fat interfaces of loosely
>> related boilerplate declarations. Does it in practice?
> I'm not sure what you're asking here (perhaps it gets answered
> below).
>> And can you, for example, declare that a pointer to any type is a
>> model of Dereferenceable in one fell swoop?
> Yes. For example
>   instance (Clonable x) => Clonable [x] where
>     clone l = map clone l  -- recall: "map f l" applies f to each
>                            -- element of list l
> declares a "list of T" to be Clonable for all Clonable T.

Good, this is important.

>> > - "Clonable and Foo were developed separately"
>> >
>> > Suppose they come from two different vendor libraries, and today
>> > you decide to use those two libraries together. You, the client
>> > of these libraries, will have to write the instance declaration.
>> > In the implementation you may have to "smooth over" the
>> > interface: we can imagine that perhaps Foo natively provides a
>> > "duplicate" method, so you have to write
>> >   instance Clonable Foo where
>> >     clone aFoo = aFoo.duplicate()  // using C++ syntax to
>> >                                    // illustrate the idea
>> >
>> > Etc. There are probably other scenarios, too.
>> Yes, I've also been wanting this in C++. The ability to
>> rename/remap operations to achieve conformance seems highly
>> desirable. But
> Well, you can already do this in C++ (using the age-old "any problem
> in computer science can be solved by adding an extra layer of
> indirection").

Let me restate. I've been wanting something like this as part of a
proposed C++ concept mechanism for C++ '0x. The current papers don't
yet appear to provide for a remapping facility.

>> somehow if the map is the identity map I don't want to manage the
>> declaration. But I am still digesting your strong typing/weak
>> typing analogy so this may change.
> Aha! There is an important, yet subtle, point looming here that
> deserves to be brought to light.
> When the concept precedes the model (which is the "ideal", and
> probably also the most common case), there is no need for any map at
> all. That is, if someone had long ago defined
>   namespace cool_container {
>     template <class CoolContainer>
>     typename cool_iterator_traits<CoolContainer>::iterator_type
>     begin( CoolContainer c );
>
>     template <class CoolContainer>
>     typename cool_iterator_traits<CoolContainer>::iterator_type
>     end( CoolContainer c );
>   }
> so that the standard idiom for iterating over a container was
>   C c;  // some cool_container
>   typedef cool_iterator_traits<C>::iterator_type I;
>   for( I i = cool_container::begin(c);
>        i != cool_container::end(c); ++i )
>     whatever( *i );
> then no one would ever write begin()/end() functions/methods for
> user-defined data types like "list" or for builtin types like arrays.
> Instead people would _specialize_ the _existing_ begin()/end()
> functions.
> That is, there is no reason to write your own function named foo()
> and then specialize FooConcept::foo() to call your foo(). You only
> specialize the existing name, since you are modeling the concept.
> Put another way, it is rare in Haskell for an instance declaration to
> just define functions which "forward the work elsewhere". Instead,
> what usually happens is that the instance declarations are the one
> and only place where the "work" is defined.

IIUC, the above paragraph surprises me. I imagine that it is not
uncommon for operations in a generic program to participate in
modeling multiple concepts (possibly unrelated in the concept
taxonomy).

> I feel like I am being long-winded and I am not sure if I am
> communicating what I am trying to say. I guess my point is, if you
> find yourself having to "manually specify the identity map" (or any
> "map", for that matter), then somewhere along the line you have
> probably already dropped the ball. Instance definitions are meant
> to be _the_ definitions, not just a common storehouse for a map to
> the real definitions which are scattered elsewhere.

I don't feel that you are being long-winded. I've been specifically
asking you to describe how things work in Haskell in practice and
you've been patiently responding. If folks are bored they can ask me
to stop asking you these questions. In any case, I'm finding your
posts useful and I appreciate that you are taking the time to write
them.
The part about "instance definitions are meant to be _the_
definitions" still seems foreign. That is, I expect that an arbitrary
operation may participate in modeling multiple concepts. I am
reluctant to commit to letting an operation be "owned" by a particular
instance definition. As I see it, such a notion of ownership is one of
the overconstraints imposed by the OOP/interface-based programming
paradigms. This puts me back on my "inheritance/member functions
considered harmful" track.

>> > - the first client to ever try to use two separate libraries (one
>> > which defines foo(), and the other which defines T) together
>> > he will not, because someone "upstream" will have already done it
>> > for him.
>> What about the second client of the two separate suppliers (of
>> foo() and T)? Presumably the first client will not want to intrude
>> on the headers supplying foo and T. Does the first client supply a
>> new header that establishes the desired name conformance?
> Yes. In the extremely likely case that the interfaces do not match
> exactly, he also writes the "glue code" to massage the interfaces
> accordingly (that unfortunate "map" we were discussing above). (This
> is all the intrinsic penalty we suffer for lack of design
> omniscience.)
>> Does the second client add to this file?
> No, he #includes it.
>> What then are the dependencies?
>      Clonable   Foo
>          \      /
>         Client1Glue
>              |
>          Client2
> Looks ugly, eh? Tough! :)

Just to recap, is the point that you are willing to accept this
ugliness in order to avoid accidental structural conformance? Do the
arguments about speed fade somewhat when we include headers full of
instance definitions?

> Really, the point is, this problem is _intrinsic_to_the_situation_.
> This is not a side-effect of "declaring conformance using a
> type-class like mechanism". No. It is a side-effect of "wanting to
> use two independently developed libraries together when the
> libraries do similar things but use different interfaces". The only
> way to avoid complex dependencies like this is to either (1)
> refactor or (2) have had more foresight from the outset.

In the structural conformance world view, if both libraries happen to
conform structurally then I don't see client 2 depending on client 1
in any way. Of course in the structural conformance with remapping
scenario we may meet similar problems.

>> >> In practice this would seem to lead to users avoiding generic
>> >> functions, just as the need to write helper functions to use STL
>> >> today is one barrier to acceptance (which is why we like lambda).
>> >
>> > Indeed, "instance declarations" must be as succinct as possible.
>> Locality of declaration may also be important.
> I dunno if I agree, since it begs the question, "local to whom"?
> If we want to say
> instance DragDropable Foo Bar ...
> are you saying this should be "local" to DragDropable? Foo? Bar?
> Clearly it must be "downstream" of all of them.
>> > The framework I'm envisioning succeeds in the cases where the
>> > concept precedes the model. In those cases, the author of the
>> > model declares conformance to the concept using inheritance:
>> >
>> > // To say that "list_iterator models ForwardIteratorConcept",
>> > // write code along the lines of
>> > class list_iterator : public forward_iterator_tag { ... };
>> And also input_iterator_tag, etc.?
> Ah, oops; my code above probably should have said
> "bidirectional_iterator_tag" (or whatever the appropriate "most
> refined concept(s)" is(are)).
>> How do you establish your concept hierarchy?
> I dunno, something like
>   template <class T>
>   class DerivedConcept : public refines<BaseConcept<T> > { ... };
> maybe? When I said above "The framework I'm envisioning", it is by
> no means a "concrete" vision.

I wasn't thinking so much of your particular framework. I was mainly
curious how you might answer given that you are informed by the idioms
in Haskell.

>> Also I don't yet see how this scales to the multi-sorted case.
> In the case of
>   instance DragDropable Foo Bar ...
> I imagine that the best way to express it would be along the lines of
>   class Foo : public drag_dropable_tag<_1,Bar> { ... };
> However in more complex cases, such as
>   -- Foo can be DragDrop-ed onto any Quxable type
>   instance (Quxable q) => DragDropable Foo q ...
> I think we'd have to fall back on template specialization:
>   class Foo { ... };
>
>   template <class Quxable>
>   struct DragDropableConcept<Foo,Quxable> {
>     static const bool value = QuxableConcept<Quxable>::value;
>   };
> (If your point is "no matter how you try to implement this in current
> C++, it will turn into a horrible ugly mess", then consider the
> point well-taken. :) )

No, that was not my point. I really did want to see how it might scale.
I am still trying to improve my understanding of how concept-like
features work in other languages to guide my opinion on how they
should look in C++ '0x. Incidentally, even today I find
boost::function_requires and the concept archetypes to be rather

 - Mat

Boost list run by bdawes at, gregod at, cpdaniel at, john at