Subject: Re: [Boost-users] Writing large binary files with boost gzip
From: Steven Watanabe (watanabesj_at_[hidden])
Date: 2011-04-07 16:28:54
AMDG
On 04/07/2011 12:09 PM, Anders Knudby wrote:
> Hello all, please help! I'm truly banging my head against the wall here. I
> am trying to write a function that will allow me to write gzipped binary
> files (~50 MB) and read them again. I have data in memory. Here's my code
> for writing:
>
> namespace io = boost::iostreams;
>
> filename = "c:/test.bin.gz"; //My output file
> int size = 5000000; //Data size in bytes, ~5 MB
>
> //Create filtering_ostream
> io::filtering_ostream out; //Creates a filtering_ostream called out
> out.push(io::gzip_compressor()); //Assigns the gzip_compressor to out
> out.push(io::file_sink(filename)); //Assigns a file sink to out
>
> char* memblock = new char [size]; //This is my data. In reality memblock
> will have been created earlier and filled with real data
>
> out.write(memblock, size); //Do the writing
>
> delete[] memblock; //Clean up
>
>
> As written above, the resulting file, c:/test.bin.gz, is corrupt. If I try to
> decompress it, either with gzip or winrar, I get an error message. However,
> if I instead set size = 4000000 (~4 MB) (slightly smaller), the resulting
> file works just fine. My problem therefore is that my actual size is 50
> MB...
>
Have you checked that the stream is closed correctly?
I tried it on Linux, and it seemed to work fine
after I added the #includes, main(), etc.
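
For reference, here is a minimal sketch of a complete program along the lines of
the code above (the filename, the 5000000-byte size, and the std::vector used as
a stand-in for the real data buffer are placeholders, not the original poster's
exact setup). The point it illustrates is the stream-closing question: the
filtering_ostream has to be destroyed (or have its chain torn down) so the
compressor is closed and the gzip trailer is written before anything tries to
read the file.

#include <boost/iostreams/filtering_stream.hpp>
#include <boost/iostreams/filter/gzip.hpp>
#include <boost/iostreams/device/file.hpp>
#include <ios>
#include <string>
#include <vector>

namespace io = boost::iostreams;

int main()
{
    const std::string filename = "test.bin.gz"; // adjust the path as needed
    const std::size_t size = 5000000;           // ~5 MB of dummy data
    std::vector<char> memblock(size);           // stand-in for the real data

    {
        io::filtering_ostream out;
        out.push(io::gzip_compressor());
        out.push(io::file_sink(filename, std::ios::out | std::ios::binary));
        out.write(&memblock[0], static_cast<std::streamsize>(memblock.size()));
    } // 'out' is destroyed here; the destructor closes the filter chain,
      // which flushes the compressor and writes the gzip trailer. Skipping
      // this step can leave the file truncated and undecompressable.

    return 0;
}

If an extra scope is awkward, calling out.reset() should also close the chain
before the file is reopened for reading. This sketch assumes Boost.Iostreams was
built with zlib support and that the program links against boost_iostreams and
zlib.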
In Christ,
Steven Watanabe