From: Geoffrey Irving (irving_at_[hidden])
Date: 2006-03-14 13:39:59
On Tue, Mar 14, 2006 at 01:11:07PM -0500, Edward Diener wrote:
> Paul Giaccone wrote:
> > I'm having problems with deserialising floating-point (double) values
> > that are written to an XML file. I'm reading the values back in and
> > comparing them to what I saved to ensure that my file has been written
> > correctly. However, some of the values differ in about the seventeenth
> > significant figure (or thereabouts).
> > I thought Boost serialization used some numerical limit to make sure
> > that values are serialised exactly to full precision, so what is
> > happening here?
> This is a common cause of errors when using floating point values.
> Writing a floating point value to a string representation, as XML
> values are, and attempting to read that string back does not
> guarantee that the floating point value will remain exactly the same
> since there are a number of floating point values which have no exact
> representation in the C++ floating point formats. That is simply because
> of the nature of floating point representation used in C++ and most
> modern languages. After all, the number of real values within any
> range of numbers is infinite while the set of C++ floating point
> values cannot be. The only way to guarantee what you want for
> floating point values is to write and read back to a binary
> representation of the value.
It should still be possible to write out a value that can be uniquely
mapped back to the exact bits. For example, you could write out enough
digits that the original value is the closest representable number to
the printed decimal; for IEEE-754 doubles, 17 significant digits are
always enough.

In my experience, having the i/o system treat floating point values
exactly is extremely important to debugging, especially when dealing
with chaotic systems (which includes most scientific applications).
If you need your program to run for three hours to find a corner case
that crashes it, and restarting the program from an output dump "fixes"
the problem due to a change in the last bit, debugging becomes nearly
impossible.

Binary output solves this, but disallowing text output for debugging
purposes would be unfortunate.