#include <boost/pool/object_pool.hpp>

#include <string>
#include <vector>

class test
{
public:
    test(void) {}

    std::vector<const int*> m_a1;
    int* m_b1;
    int* m_b2;
    const int* m_c1;
    const int* m_c2;
    const std::string* m_d1;
    unsigned int m_e1;
    unsigned int m_e2;
    unsigned char m_f1;
    unsigned char m_f2;
    unsigned char m_f3;
    unsigned char m_f4;
};

static boost::object_pool<test> s_my_pool;

int main(int argc, char* argv[])
{
    int max = 2097120;
    for (int i = 0; i < max; i++)
    {
        test* tst_c = s_my_pool.construct(test());
    }

    return 0;
}

 

Consider the code above, compiled with Visual Studio 2010. The purpose of the class is not important; what matters is its size (48 bytes). The goal is to use object_pool to minimize memory usage.

When running this code, memory use rises to just a few bytes more than the needed 48 bytes * 2,097,120 = 100,661,760 bytes, so the object_pool seems to be working fine.

But if I increase the number of elements created by one (to 2,097,121), memory use jumps to a massive 200,000,000+ bytes. It seems the object_pool has reached a threshold and its strategy is then to double the allocated memory. But this can't be right, can it? If so, the object_pool is useless, because it uses more memory than a simple test* tst = new test(); would have done.
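For what it's worth, the threshold fits a doubling pattern exactly. Assuming the pool's first block holds 32 chunks (boost::pool's default next_size) and each further block is twice the size of the previous one, which is how I read the growth strategy, the capacity after 16 blocks is 32 * (2^16 - 1) = 2,097,120 chunks, and element number 2,097,121 forces a 17th block of 32 * 2^16 = 2,097,152 chunks, i.e. roughly another 100 MB in one step. A small sketch of that arithmetic (the 32 and the doubling are my assumptions about the growth strategy, not verified against the Boost source):

#include <cstddef>
#include <iostream>

int main()
{
    std::size_t next_size = 32;  // boost::pool's default first-block chunk count
    std::size_t capacity = 0;    // total chunks allocated so far

    for (int block = 1; block <= 17; ++block)
    {
        capacity += next_size;
        std::cout << "block " << block << ": +" << next_size
                  << " chunks, capacity " << capacity
                  << " (" << capacity * 48 << " bytes at 48 bytes/chunk)\n";
        next_size *= 2;  // the doubling I suspect happens on each growth
    }
    return 0;
}

At block 16 this prints a capacity of 2,097,120 chunks (100,661,760 bytes), which is exactly my threshold, and block 17 alone adds about another 100 MB.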

Why is object_pool acting like this, and is there a way to change the behavior?
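The only knob I have found so far is next_size: object_pool's constructor takes the chunk count of the first block (default 32), so one workaround might be to size that first block for the whole workload up front, so the pool never has to grow and therefore never doubles. A sketch, assuming the constructor argument behaves the way I read it in the headers (test48 is just a hypothetical stand-in for my 48-byte class):

#include <boost/pool/object_pool.hpp>
#include <cstddef>

struct test48 { char payload[48]; };  // stand-in for the 48-byte class above

int main()
{
    const std::size_t count = 2097121;

    // Size the first block for the whole workload so the pool never
    // has to allocate a second, doubled block.
    boost::object_pool<test48> pool(count);

    for (std::size_t i = 0; i < count; ++i)
        pool.construct();

    return 0;
}

There also appears to be a public set_next_size() on object_pool (forwarded to the underlying pool), and newer Boost versions seem to accept a max_size constructor argument that caps the block growth, but I have not verified either against the Boost that ships with my VS2010 setup.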

PS: The threshold value is not exactly the same in another application I have written, but it is pretty close.

 

 

Best Regards

Christian Berg