Subject: Re: [Boost-users] A forward iterator need not be default-constructible
From: Dave Abrahams (dave_at_[hidden])
Date: 2011-10-03 11:09:47
on Mon Oct 03 2011, Krzysztof Żelechowski <giecrilj-AT-stegny.2a.pl> wrote:
> Dave Abrahams wrote:
>
>> *rewinds to opening post*
>>
>> Lack of documentation aside, what's the underlying problem there? Oh,
>> the result of bind1st is not default-constructible? Well, that's what
>> you get for using deprecated components ;-)
>
> It would help if you could offer a replacement that does the same thing,
> compiles in traditional mode and does not involve defining a dedicated
> iterator class.
You could pass it through boost::function. I'm not sure if boost::bind
makes default-constructible things... nor am I sure about phoenix/lambda.
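(Not from the original exchange, just a sketch of what I mean: the result
of std::bind1st is not default-constructible, but boost::function is, so
you can store the bound functor in one.)

    #include <cassert>
    #include <functional>
    #include <boost/function.hpp>

    int main()
    {
        // std::binder1st<std::plus<int> > has no default constructor,
        // but boost::function<int(int)> does.
        boost::function<int(int)> f;            // default-constructed (empty)
        f = std::bind1st(std::plus<int>(), 1);  // assign the bound functor later
        assert(f(2) == 3);
        return 0;
    }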
>> Now, I *think* what you want is for the Range library to "degrade
>> gracefully" when you don't satisfy its concept requirements, and the
>> appropriate thing to do there is shut off concept checking.
>
> Do not switch all concept checking off when it fails; switch the default-
> constructibility check off always.
Yes, but that would be wrong, if the library is going to advertise
that it checks for the standard iterator concepts. On the other hand,
if the library doesn't advertise its checks, then that's up to the
author.
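To make the "advertising" point concrete, here is roughly the kind of
compile-time stamp of approval I have in mind (my illustration, not the
Range library's actual code); the assertion fails for types like the
result of std::bind1st, which cannot be default-constructed:

    #include <boost/concept_check.hpp>
    #include <functional>

    template <class F>
    void stamp_of_approval(F)
    {
        // Forward iterators (and the function objects some iterator
        // adaptors store by value) must be default-constructible.
        BOOST_CONCEPT_ASSERT((boost::DefaultConstructible<F>));
    }

    int main()
    {
        stamp_of_approval(std::plus<int>());                      // OK
        // stamp_of_approval(std::bind1st(std::plus<int>(), 1));  // won't compile
        return 0;
    }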
>> If the Range library were to put its "concept checking stamp of
>> approval" on a nonconforming iterator it would be failing to provide
>> an assurance I want: that the resulting iterator can be used with
>> *any* algorithm requiring iterators. If I can get an error later by
>> passing the resulting iterator to some algorithm that happens to use
>> default construction, then I have a right to complain that concept
>> checking is broken.
>>
>
> My point is, I am unable to imagine an algorithm that requires
> constructing iterators out of thin air. Some algorithms do that for no
> reason other than that the programmer did not know better; they can be
> fixed to avoid the construct, and when they are fixed the code gets
> better. So requiring default construction of iterators leads to looser
> library code. If you happen to know such an algorithm, please
> share it with us.
I don't. It doesn't matter whether the requirement was wrong, though.
>>> An algorithm using a bidirectional iterator for a random-access iterator
>>> will still work, only it will take longer to accomplish.
>>
>> No. A bidirectional iterator can provide different semantics (or invoke
>> undefined behavior) for random-access iterator operations that are not
>> part of the bidirectional iterator concept.
>
> But I can use the operations defined on a bidirectional iterator to simulate
> the operations of a random access iterator, can't I?
No; you can't measure the distance between two arbitrary bidirectional
iterators unless you know which one comes first. But my point is that
if the iterator is truly bidirectional but it advertises itself to be
random-access, you have no way of knowing to use multiple ++ invocations
instead of one +=, and += could do something arbitrarily horrible.
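Concretely (a sketch of the usual tag-dispatch idiom, not any particular
library's code): an algorithm picks += or repeated ++ based on what the
iterator claims to be, so a bidirectional iterator that advertises
std::random_access_iterator_tag gets handed straight to +=:

    #include <iterator>

    template <class Iter, class Distance>
    void advance_impl(Iter& it, Distance n, std::random_access_iterator_tag)
    {
        it += n;                  // one jump; only valid for random access
    }

    template <class Iter, class Distance>
    void advance_impl(Iter& it, Distance n, std::bidirectional_iterator_tag)
    {
        for (; n > 0; --n) ++it;  // fall back to repeated increments
        for (; n < 0; ++n) --it;
    }

    template <class Iter, class Distance>
    void my_advance(Iter& it, Distance n)
    {
        // Dispatch on the advertised category; a lie here silently
        // selects the += overload.
        advance_impl(it, n,
            typename std::iterator_traits<Iter>::iterator_category());
    }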
>>> Also, being a singular iterator is independent of type, while being a
>>> random-access iterator is determined by type.
>>
>> No again. Objects of this type are not singular iterators:
>>
>> struct nonsingular
>> {
>>  private:
>>     void operator=(nonsingular const&);
>> };
>
> Why do you call this thing an iterator?
Exactly my point; I don't.
--
Dave Abrahams
BoostPro Computing
http://www.boostpro.com