From: Darren Cook (darren_at_[hidden])
Date: 2004-06-05 20:30:45
I'm using Boost.Serialization between two programs: the first creates the data
and serializes it; the second loads that data back in and analyzes it.
With just 420 data samples it all worked fine: I created a
vector<Sample> in memory, serialized it, and loaded it on the other side.
When I moved to 7000 data samples, the first program (the creator) sucked up
all the memory on the machine and then some.
So I moved to serializing each data sample as it was created. To read them
back in, I changed my code to deserialize one sample at a time in a loop.
But every time it fails with an assert (quoted below). This seems to happen
when it has reached end of file. I cannot know in advance how many data
samples I'll write to disk, so it seems I have three options:
A: Make a special "zero" version of Sample to mark end of file
B: At end of program 1 write the number of samples to a special file, and
read that in before starting the above loop, so I know when to stop.
C: Alter the serializer class to throw an exception instead of an assert; I
can then catch it and carry on.
What is the best way? Is adding a terminator byte to Boost.Serialization an
option (i.e. so I don't have to make a terminator version of each data
structure I want to serialize)?
Incidentally, my loop above copies each sample as it adds it to the vector.
Is this pattern going to be common enough to make it worth including in
Boost.Serialization as a helper function, which could then be optimized with
in-place construction or something clever like that? (It could then take care
of termination itself as well.)
The assert message (truncated) is:

IStream>::load_binary(void*, unsigned int) [with Archive =
boost::archive::binary_iarchive, IStream = std::basic_istream<char,
std::char_traits<char> >]: Assertion `count ==
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk