Boost Users:
From: Sascha Ochsenknecht (s.ochsenknecht_at_[hidden])
Date: 2006-04-03 15:05:45
Hello,
I'm using the Boost Serialization library to store my data structure.
I want to use the binary archive type by default:

    boost::archive::binary_oarchive(std::ostream &s)  // saving
    boost::archive::binary_iarchive(std::istream &s)  // loading
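
Concretely, the pattern looks like this (a minimal sketch; "my_data" stands in for my real root object):

    #include <fstream>
    #include <boost/archive/binary_oarchive.hpp>
    #include <boost/archive/binary_iarchive.hpp>

    // saving
    {
        std::ofstream ofs("data.bin", std::ios::binary);
        boost::archive::binary_oarchive oa(ofs);
        oa << my_data;   // my_data: placeholder for my real root object
    }

    // loading
    {
        std::ifstream ifs("data.bin", std::ios::binary);
        boost::archive::binary_iarchive ia(ifs);
        ia >> my_data;
    }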
But I noticed that these files can be very big compared to the stored data: I got a binary archive of around 1.5 GB, yet when I compress it, only ~200 MB are left (!).
It seems that there is a lot of overhead or redundant data (I see a lot of '0' bytes when I look into the file with a hex editor).
I tried the gzip filter of the Iostreams library, but I want to avoid this in production because of the increased runtime.
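What I tried was roughly this (a sketch; the file name and "my_data" are placeholders):

    #include <fstream>
    #include <boost/iostreams/filtering_stream.hpp>
    #include <boost/iostreams/filter/gzip.hpp>
    #include <boost/archive/binary_oarchive.hpp>

    namespace io = boost::iostreams;

    std::ofstream ofs("data.bin.gz", std::ios::binary);
    io::filtering_ostream out;
    out.push(io::gzip_compressor());   // compress everything the archive writes
    out.push(ofs);
    boost::archive::binary_oarchive oa(out);
    oa << my_data;                     // declaration order ensures gzip flushes before ofs closes

This shrinks the file, but the extra compression time is what I want to avoid.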
Some information about my data structure (maybe helpful; a sketch follows below):
- using a lot of pointer
- using a lot of std::vector
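
To illustrate, the types look roughly like this (names are made up; only the shape matters):

    #include <vector>
    #include <boost/serialization/vector.hpp>  // serialization support for std::vector

    struct node {                 // illustrative only
        std::vector<double> values;
        node* next;               // the library tracks pointers to avoid duplicates

        template <class Archive>
        void serialize(Archive& ar, const unsigned int /*version*/)
        {
            ar & values;
            ar & next;            // serializing through a pointer
        }
    };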
Has anybody investigated the same problem?
Is there a way to decrease the archive size while storing the same amount of data?
What could be a solution? Writing my own archive class, optimized for my data structure?
Thanks in advance,
Sascha