Boost Users:
Subject: Re: [Boost-users] A forward iterator need not be default-constructible
From: Krzysztof Żelechowski (giecrilj_at_[hidden])
Date: 2011-10-03 10:46:07
Dave Abrahams wrote:
> *rewinds to opening post*
>
> Lack of documentation aside, what's the underlying problem there? Oh,
> the result of bind1st is not default-constructible? Well, that's what
> you get for using deprecated components ;-)
It would help if you could offer a replacement that does the same thing,
compiles in traditional mode and does not involve defining a dedicated
iterator class.
>
> More seriously... I understand the problem. What's obnoxious is that
> the standard doesn't have a consistent view of the importance of Regular
> Types (c.f. Stepanov). Those binders are not Regular since they don't
> have a default-constructor... well, neither are many iterators because
> they don't all have a total ordering, but I think the notion of Regular
> may have been expanded since '98... but anyway, yeah, let's just say the
> standard's inconsistent view of default construction causes an
> interoperability problem.
>
> For the record, I'm torn about the whole "Regular Types" thing. I can
> see the argument for it, but it also forces weaker invariants.
>
> Now, I *think* what you want is for the Range library to "degrade
> gracefully" when you don't satisfy its concept requirements, and the
> appropriate thing to do there is shut off concept checking.
Do not switch all concept checking off when it fails; just disable the
default-constructibility check unconditionally.
> If the
> Range library were to put its "concept checking stamp of approval" on a
> nonconforming iterator it would be failing to provide an assurance I
> want: that the resulting iterator can be used with *any* algorithm
> requiring iterators. If I can get an error later by passing the
> resulting iterator to some algorithm that happens to use default
> construction, then I have a right to complain that concept checking is
> broken.
>
My point is that I am unable to imagine an algorithm that genuinely requires
constructing iterators out of thin air. Some algorithms do that for no reason
other than that the programmer did not know better; they can be fixed to avoid
the construct, and when they are fixed, the code gets better. So requiring
default construction of iterators leads to sloppier library code. If you
happen to know of such an algorithm, please share it with us.
>>>> Being a singular iterator is not a concept, it is a run-time property.
>>>> The compiler cannot check whether an iterator is singular;
>>>> it is equivalent to the halting problem.
>>>
>>> It can't check whether an iterator is random-access either. All (good)
>>> concepts have semantic constraints that can't be checked by the
>>> compiler.
>>
>> An algorithm using a bidirectional iterator for a random-access iterator
>> will still work, only it will take longer to accomplish.
>
> No. A bidirectional iterator can provide different semantics (or invoke
> undefined behavior) for random-access iterator operations that are not
> part of the bidirectional iterator concept.
But I can use the operations defined on a bidirectional iterator to simulate
the operations of a random-access iterator, can't I?
>
>> Also, being a singular iterator is independent of type, while being a
>> random-access iterator is determined by type.
>
> No again. Objects of this type are not singular iterators:
>
> struct nonsingular
> {
> private:
> void operator=(nonsingular const&);
> };
Why do you call this thing an iterator?
>
> Singular values crop up in all kinds of contexts, BTW. Do ints support
> division? Well, yes, unless the denominator is zero.
>
Best regards,
Chris
Boost-users list run by williamkempf at hotmail.com, kalb at libertysoft.com, bjorn.karlsson at readsoft.com, gregod at cs.rpi.edu, wekempf at cox.net