
Boost-Build:

From: David Abrahams (david.abrahams_at_[hidden])
Date: 2002-07-03 11:04:11


----- Original Message -----
From: "Vladimir Prus" <ghost_at_[hidden]>

> > We're going to have to fix this in the end. When I try to build with all
> > of my compilers in libs/python/test, the memory image is 70Meg. :(
>
> What I was trying to say is that "newstr" is probably not the worst
> problem. There may be a lot of memory allocated during the jam run but it
> is freed at the end. So, strictly speaking, there's no memory leak -- just
> inefficient memory usage. If this is the case, I suspect a major overhaul
> would be required to fix it.

Yes, I realize that :(

I did some more measurements, and I don't think the newstr stuff accounts
for most of the loss in my case. I am building lots of targets, and I have a
feeling that changes the profile considerably.

When I dump all of the strings in the newstr cache, it comes out to around
6MB, which I consider reasonable. However, as I said, my peak memory image
is more than 10x that big.

If we wanted to reduce the size of the newstr cache, I think the quickest
thing we could do would be to store the cached strings in a vector sorted by
the lexicographic order of their reversals, and then, when an incoming string
matches the tail of an already-cached string, return a pointer into that
cached string's tail.
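
Here is a minimal sketch of that idea. This is not Boost.Jam's actual newstr
code -- it is C++ rather than jam's C, and the SuffixCache/intern names are
invented for illustration. The point is just the data structure: entries are
kept sorted by their reversed text, so any cached string that ends with an
incoming string can be found with one binary search and handed out as a
pointer into its tail instead of a fresh allocation.

// Sketch only -- invented names, not the real newstr implementation.
#include <algorithm>
#include <cstring>
#include <memory>
#include <string>
#include <utility>
#include <vector>

class SuffixCache {
    // Each entry: the reversed text (the sort key) plus a heap copy whose
    // address stays stable across later insertions.
    std::vector<std::pair<std::string, std::unique_ptr<char[]>>> entries_;

public:
    // Return a stable C string equal to s, reusing the tail of a cached
    // string that ends with s when one exists.
    const char* intern(const std::string& s) {
        std::string rev(s.rbegin(), s.rend());
        // Every cached string ending with s has a reversed text that starts
        // with rev; those entries form a contiguous run starting here.
        auto it = std::lower_bound(
            entries_.begin(), entries_.end(), rev,
            [](const auto& e, const std::string& key) { return e.first < key; });
        if (it != entries_.end() &&
            it->first.compare(0, rev.size(), rev) == 0) {
            // Found a cached string that ends with s: point into its tail.
            return it->second.get() + (it->first.size() - s.size());
        }
        // No tail match: store a new stable copy at the sorted position.
        auto copy = std::make_unique<char[]>(s.size() + 1);
        std::memcpy(copy.get(), s.c_str(), s.size() + 1);
        it = entries_.insert(it, {std::move(rev), std::move(copy)});
        return it->second.get();
    }
};

The trade-off versus a plain hash is a binary search plus one string reversal
per lookup and linear-time insertion, so whether it actually wins would
depend on how much tail sharing the real target paths exhibit.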

However, as I said, I don't think that's the main problem.

-Dave


