
Subject: Re: [boost] [unordered] unordered_set::erase() complexity bug?
From: Daniel James (daniel_james_at_[hidden])
Date: 2009-11-24 03:55:12


> 2009/11/23 Christopher Jefferson <chris_at_[hidden]>:
>>
>> One way to significantly reduce this problem is to use a different hash function, which shuffles the values more.

That will improve things, but not by much, as there'll still be a lot
more buckets than elements. A better hash function would need to be
combined with regularly rehashing the container by hand (i.e. calling
's.rehash(0)') as elements are erased.
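
Something like this, say (untested sketch; the mixing hash and the
erase_and_shrink helper are only illustrations, neither is part of
the library):

#include <boost/unordered_set.hpp>
#include <boost/cstdint.hpp>
#include <cstddef>

// An illustrative bit-mixing hash (MurmurHash3's 64-bit finaliser);
// the point is just to scatter nearby keys across the buckets.
struct mixed_hash {
    std::size_t operator()(boost::uint64_t x) const {
        x ^= x >> 33;
        x *= 0xff51afd7ed558ccdULL;
        x ^= x >> 33;
        x *= 0xc4ceb9fe1a85ec53ULL;
        x ^= x >> 33;
        return static_cast<std::size_t>(x);
    }
};

typedef boost::unordered_set<boost::uint64_t, mixed_hash> set_type;

// Erase, then shrink the bucket array once it's much bigger than
// the number of remaining elements. rehash(0) asks for the smallest
// bucket count the maximum load factor allows. The threshold here
// is arbitrary; rehashing on every erase would itself be expensive.
void erase_and_shrink(set_type& s, boost::uint64_t key) {
    s.erase(key);
    if (s.bucket_count() > 4 * s.size() + 16)
        s.rehash(0);
}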

2009/11/23 Daniel James <daniel_james_at_[hidden]>:
>
> I could apply another hash function to the hash value to make this
> less likely, or perhaps store a pointer to the last non-empty bucket.
> I'll think about it.

Neither of those will work; there will still be other similar cases.
With the current interface, the only way to deal with this that I can
think of is to change the data structure, which will require more
memory, and there's been opposition to that. It could possibly use an
iterator which only finds the next element when it's used, although
that would make a lot of other operations more expensive.
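
For reference, this is the sort of usage that triggers the problem
(untested sketch, sizes arbitrary). erase has to return an iterator
to the element after the one erased, so once most buckets are empty
each call walks a long run of empty buckets, and draining the
container can take quadratic time:

#include <boost/unordered_set.hpp>

int main() {
    boost::unordered_set<int> s;
    for (int i = 0; i < 100000; ++i)
        s.insert(i);

    // Drain one element at a time. Each erase(begin()) has to find
    // the next element to return an iterator to it, and begin() has
    // to find the first non-empty bucket, so both scans get longer
    // as the buckets empty out while bucket_count() stays large.
    while (!s.empty())
        s.erase(s.begin());
}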

Daniel

