From: Paul Giaccone (paulg_at_[hidden])
Date: 2006-03-14 12:38:48
I'm having problems with deserialising floating-point (double) values
written to an XML file. I'm reading the values back in and comparing
them with what I saved, to make sure the file was written correctly.
However, some of the values differ in roughly the seventeenth
significant figure.
I thought Boost serialization used some numerical limit to make sure
that values are serialised exactly to full precision, so what is
happening here?
Example:
Value in original object, written to file: 0.0019075645054089487
Value actually stored in file (by examination of XML file):
0.0019075645054089487 [identical to value written to file]
Value after deserialisation: 0.0019075645054089489
It looks like there is a difference in the least-significant bit, as
examining the memory for these two values gives:
Original value: b4 83 9b ca e7 40 5f 3f
Deserialised value: b5 83 9b ca e7 40 5f 3f
(where the least-significant byte is on the left)
Note the difference in the first byte.
I'm using Boost 1.33.1 with Visual Studio 7.1.3088 in debug mode.
Paul
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk