
Subject: Re: [boost] [review] The review of Boost.DoubleEnded starts today: September 21 - September 30
From: Tim Song (t.canens.cpp_at_[hidden])
Date: 2017-09-26 00:43:40

Just some comments on devector for now, based on a quick read of the
documentation and implementation.


I don't find the unsafe functions sufficiently motivated. Introducing
footguns should require a compelling performance justification, but
there are no actual benchmarks. The unsafe_push_* functions encourage
heavy use of reserve(), which, if done incorrectly, can easily
prevent the growth policy from working. Moreover, the design is
inconsistent. Why unsafe_push_* but not unsafe_emplace_*?

What's the motivating use case for the unsafe_uninitialized functions
that can't be handled by a non-invariant-violating design like
default_init? The sole example in the documentation is a
devector<char> whose content will be filled by recv later, which can
be handled equally well. Also, assuming that this container meets the
allocator-aware container requirements, the user would have to
construct and destroy the elements via
allocator_traits<>::construct/destroy, not just placement new.
Otherwise, your container's destructor would be asking its allocator
to destroy something it didn't construct.

I'm somewhat surprised that dropping the move_if_noexcept/strong
exception safety dance doesn't seem to have been considered for a
container that's supposed to be performance-oriented.

> typedef pointer iterator;
> typedef const_pointer const_iterator;

Why are pointers being used as iterator types directly?


> class devector
> : Allocator

Even private derivation can interfere with overload resolution in
unexpected and undesirable ways. You have enough data members around
to leverage EBO without having to make devector itself derive from the
allocator. Separately, at least one code path incorrectly calls
push_back rather than emplace_back.

Insofar as I can determine, your move assignment calls clear(), which
doesn't deallocate storage, and then overwrites the only allocator
capable of deallocating the storage you are currently holding -
without deallocating it first.

More generally, the implementation of allocator support requires
substantial improvement. An allocator that doesn't propagate on X is
not required to support X at all, but there's no handling for that in
your code. Another example: allocator_traits<>::construct takes raw
pointers, not the possibly fancy `pointer` type that the
implementation passes it.

The fast path for range-insert-at-front
is broken based on a cursory inspection:

vector<int> x = {1, 2, 3};
devector<int> y = {4, 5, 6};
y.insert(y.begin(), x.begin(), x.end()); // {3, 2, 1, 4, 5, 6} (!)


Does your push_back etc. really have "amortized linear" complexity?
And doesn't that depend on the growth policy?

How can clear() have the postcondition "empty() &&
front_free_capacity() == 0 && back_free_capacity() == small buffer
size" yet not deallocate memory?

The default growth policy is:

> static size_type new_capacity(size_type capacity);
> Returns: 4 times the old capacity or 16 if it's 0.

Why those numbers? Were they arbitrarily chosen or based on some sort
of benchmark? If so, what?
