Subject: Re: [boost] [range] [general] making member functions SFINAE-friendly
From: Jeffrey Lee Hellrung, Jr. (jeffrey.hellrung_at_[hidden])
Date: 2013-02-19 19:38:53
On Tue, Feb 19, 2013 at 11:56 AM, Vadim Stadnik <vadimstdk_at_[hidden]> wrote:
> On Wed, Feb 20, 2013 at 3:12 AM, Jeffrey Lee Hellrung, Jr. <
> jeffrey.hellrung_at_[hidden]> wrote:
>
> > On Tue, Feb 19, 2013 at 2:04 AM, Vadim Stadnik <vadimstdk_at_[hidden]>
> > wrote:
>
[...]
> > > Also, this design principle makes illegal advanced and/or augmented
> > > data structures which are beneficial for many complex algorithms. For
> > > example, insert() and erase() functions for a single element with
> > > O(log N) cost are not allowed by the C++ standard, however,
> > > significantly less efficient O(N) operations are allowed for
> > > std::vector.
> > >
> >
> > I'm not really sure what to make of this paragraph. The standard makes
> > some guarantees for std:: data structures, but you're free to provide
> > your own augmented data structures with whatever complexity guarantees
> > make sense, and I guarantee you wouldn't be violating any laws.
> >
> >
> This is a bit of a non-standard interpretation of the C++ standard :).
>
This = above or below? I guess below :)
> Augmented data structures have recently been discussed a number of times
> by the Boost community, and some of them are awaiting review. There is a
> problem of recognizing the value of augmented data structures for STL,
> since, formally, the subscript operator, the function at() and iterators
> that support std::distance() with O(log N) computational complexity are
> currently non-standard features, and thus many C++ professionals are at
> least cautious about new data structures.
>
I think you're overgeneralizing and/or exaggerating. IIRC, the value of
augmented data structures is certainly recognized (although I'm not sure
what you mean by "value...for STL"). An operator[] which is O(log N) is not
unusual (see std::map), much less "non-standard".
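For instance (a trivial example, just to ground the claim):

    #include <map>
    #include <string>

    int main()
    {
        std::map<std::string, int> counts;
        counts["boost"] = 1;  // operator[] does an O(log N) tree lookup,
        counts["range"] = 2;  // and std::map is as standard as it gets
    }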
Also, IIRC, one issue with an augmented data structure proposed in the
(near) past was whether it was appropriate to classify the iterators as
random access; I think such iterators (which have logarithmic-time
advancement and differencing) are objectively *not* random access
(according to the standard), but I agree there is a gap in the standard's
iterator classification between random access and bidirectional iterators
where an iterator of an augmented data structure may lie. As an even
simpler example, a zip_iterator< bidirectional, random access > has *at
least* bidirectional traversal, but additionally has constant-time
differencing that's lost once an algorithm classifies the iterator as
"simply" bidirectional. I think the present standard-endowed traversal
categories are motivated by the standard-provided algorithms, so staying
entirely within the standard, there probably isn't much need for further
traversal granularity.
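To make that concrete, here's a minimal sketch (*not* Boost's actual
zip_iterator; the name and interface are made up for illustration) of a
pair iterator whose traversal is only bidirectional but whose difference
is still O(1) via the random access member:

    #include <cstddef>
    #include <iostream>
    #include <list>
    #include <vector>

    template< class BidiIt, class RandIt >
    struct zip2_iterator
    {
        BidiIt b;  // bidirectional member
        RandIt r;  // random access member

        zip2_iterator& operator++() { ++b; ++r; return *this; }
        zip2_iterator& operator--() { --b; --r; return *this; }

        friend bool operator==(zip2_iterator const & x, zip2_iterator const & y)
        { return x.b == y.b; }

        // Constant-time difference via the random access member; this is
        // exactly the capability an algorithm loses once it classifies the
        // iterator as "simply" bidirectional.
        friend std::ptrdiff_t operator-(zip2_iterator const & x,
                                        zip2_iterator const & y)
        { return x.r - y.r; }
    };

    int main()
    {
        std::list<int> l(4);
        std::vector<int> v(4);
        zip2_iterator< std::list<int>::iterator, std::vector<int>::iterator >
            first = { l.begin(), v.begin() }, last = { l.end(), v.end() };
        std::cout << last - first << '\n';  // 4, computed in O(1)
    }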
Lastly, I think when you say "(non-)standard feature" you really mean
"(non-)standard-*like* feature" (and I have thus far been interpreting such
remarks more literally than you probably intended).
> > > I think that the current standard specifications for computational
> > > complexities of operations are to some degree obsolete. They create
> > > unnecessary restrictions that will be either amended or removed
> > > completely in future versions of STL.
> > >
> >
> > Strongly disagree, and you haven't provided any examples of these
> > "unnecessary restrictions" (or you aren't being precise in what you
> > mean). Much code and many algorithms rely on the constant-time random
> > access of std::vector for their own complexity bounds; are you
> > suggesting the standard should drop this complexity guarantee?
> >
> >
> I can re-formulate the issue of STL design. After many years of fixing
> numerous performance bottlenecks caused, first of all, by the linear-cost
> update operations of std::vector (since it is the default STL container),
> I have come to the following conclusion.
>
Let's not forget such performance bugs are not restricted to the STL or
even C++ :)
http://thedailywtf.com/Articles/InternettoLowerCase.aspx
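The std::vector flavor of that bug class usually looks something like this
(function names made up for illustration):

    #include <algorithm>
    #include <vector>

    // Each insert at the front of a vector is O(N), so the loop is
    // accidentally O(N^2)...
    std::vector<int> descending_slow(int n)
    {
        std::vector<int> v;
        for (int i = 0; i != n; ++i)
            v.insert(v.begin(), i);  // shifts every existing element
        return v;
    }

    // ...while push_back + reverse gives the same result in O(N).
    std::vector<int> descending_fast(int n)
    {
        std::vector<int> v;
        v.reserve(n);
        for (int i = 0; i != n; ++i)
            v.push_back(i);          // amortized O(1)
        std::reverse(v.begin(), v.end());
        return v;
    }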
> The right representation of math concepts through the right interfaces is
> more useful than the specification of computational complexities of
> single operations, if a library provides a wide choice of interchangeable
> containers.
But then how do you choose the Right Container For The Job among several
choices with similar interfaces?
> Design decisions based on the computational complexity of a single
> operation are just not practical. At the design time of a library it is
> not possible to know the frequencies of all specific operations for all
> user algorithms. The total computational cost of a user algorithm is the
> more important parameter in real life.
>
Sure, but that doesn't mean you won't have some informed ideas or concrete
information on the frequency of *some* specific operations for *some* user
algorithm. And this should certainly influence your design decisions,
including which data structures are appropriate to use.
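E.g., knowing just that front insertions dominate a workload already
settles vector-vs-deque (toy illustration):

    #include <deque>
    #include <vector>

    int main()
    {
        std::deque<int> q;
        q.push_front(42);         // constant time
        std::vector<int> v;
        v.insert(v.begin(), 42);  // linear time: shifts everything after it
    }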
> The solutions to current inefficiencies of STL
Can you provide examples of such inefficiencies?
> are available through
> advanced data structures, but they are non-standard due to the
> specification of computational complexities for single operations.
>
:: mentally replacing "non-standard" with "non-standard-like" ::
Sorry, I think I've veered too far from what your original gripe was. I'm
still not clear what the problem is with developing and using
"non-standard-like" containers (in the sense that maybe they don't offer
the same complexity guarantees for their set of operations as the standard
data structures do...which is why I think calling them non-standard or
non-standard-like is misleading, but whatever).
> Something has to be changed in STL to take full advantage of these
> solutions.
>
- Jeff