
Boost Users :

From: Jason Sachs (jmsachs_at_[hidden])
Date: 2008-05-29 15:42:16


I have a bunch of data I am trying to store in shared memory: essentially a
vector of bytes plus a vector of metadata. Both vectors, in general, grow
over time at an unpredictable rate until someone stops using them. The vector
lengths are extremely variable; the total amount of shared memory could be as
small as 10 KB or as large as several hundred megabytes, and I will not know
the amount needed beforehand.

From what I understand about Boost shared memory segments, they are not
infinitely growable, so I think I have to organize my data into chunks of a
more reasonable size and allocate a series of separate shared memory segments
with a few chunks in each. (There are other application-specific reasons for
me to do this anyway.) So I am trying to figure out what a reasonable size is
for each shared memory segment (probably in the 64 KB - 1 MB range, but I'm
not sure).
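
For reference, here is a rough sketch of the chunking scheme I have in mind,
using Boost.Interprocess (the segment name "my_chunk_0", the 1 MB size, and
the MetaRecord struct are just placeholders I made up for illustration):

    #include <boost/interprocess/managed_shared_memory.hpp>
    #include <boost/interprocess/shared_memory_object.hpp>
    #include <boost/interprocess/containers/vector.hpp>
    #include <boost/interprocess/allocators/allocator.hpp>
    #include <cstdio>

    namespace bip = boost::interprocess;

    // Placeholder metadata record; my real one is application-specific.
    struct MetaRecord { std::size_t offset; std::size_t length; };

    typedef bip::managed_shared_memory::segment_manager  SegMgr;
    typedef bip::allocator<unsigned char, SegMgr>         ByteAlloc;
    typedef bip::vector<unsigned char, ByteAlloc>         ByteVec;
    typedef bip::allocator<MetaRecord, SegMgr>            MetaAlloc;
    typedef bip::vector<MetaRecord, MetaAlloc>            MetaVec;

    int main()
    {
        // One fixed-size segment per chunk; name and size are arbitrary here.
        bip::shared_memory_object::remove("my_chunk_0");
        bip::managed_shared_memory seg(bip::create_only, "my_chunk_0",
                                       1024 * 1024);

        // Each chunk holds its own byte vector and metadata vector.
        ByteVec *bytes = seg.construct<ByteVec>("bytes")
                             (ByteAlloc(seg.get_segment_manager()));
        MetaVec *meta  = seg.construct<MetaVec>("meta")
                             (MetaAlloc(seg.get_segment_manager()));

        bytes->reserve(512 * 1024);   // grow only within this chunk's budget
        meta->reserve(1024);

        std::printf("free after setup: %lu bytes\n",
                    (unsigned long)seg.get_free_memory());
        return 0;
    }

The idea would be to create a new segment (my_chunk_1, my_chunk_2, ...) once
the current one fills up, rather than trying to grow any single segment.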

Does anyone have any information about the overhead of a shared memory
segment? Just an order-of-magnitude estimate, e.g. M bytes fixed + N bytes
per object allocated + the total size of the names. Are M and N on the order
of 10 bytes, or 10 KB, or what?
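
If there is no documented figure, I assume I could probe it empirically along
these lines (again just a sketch; the segment name and sizes are arbitrary):

    #include <boost/interprocess/managed_shared_memory.hpp>
    #include <boost/interprocess/shared_memory_object.hpp>
    #include <cstdio>

    namespace bip = boost::interprocess;

    int main()
    {
        bip::shared_memory_object::remove("overhead_probe");
        bip::managed_shared_memory seg(bip::create_only, "overhead_probe",
                                       64 * 1024);

        // Fixed cost M: segment size minus what the manager reports as free.
        std::size_t fixed = seg.get_size() - seg.get_free_memory();
        std::printf("fixed segment overhead: %lu bytes\n",
                    (unsigned long)fixed);

        // Per-object cost N: drop in free memory beyond the payload of one
        // small named allocation (block header + name bookkeeping).
        std::size_t before = seg.get_free_memory();
        seg.construct<int>("a_named_int")(0);
        std::size_t per_object = before - seg.get_free_memory() - sizeof(int);
        std::printf("overhead of one named int: %lu bytes\n",
                    (unsigned long)per_object);

        bip::shared_memory_object::remove("overhead_probe");
        return 0;
    }

But I'd rather hear from someone who knows the segment manager internals
whether that kind of probe is representative before I size my chunks around it.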


