
From: Geoffrey Irving (irving_at_[hidden])
Date: 2006-05-02 13:48:47


On Tue, May 02, 2006 at 06:33:30PM +0100, John Maddock wrote:
> >> I tend to agree with the MS engineers here. I've found out only
> >> yesterday that the FPU/math library is not entirely deterministic in
> >> some calculations (including square roots and trigonometry, typical
> >> 3d stuff), so I think worrying about serialization/deserialization
> >> is useless.
>
> I disagree, it is certainly possible to serialise/deserialise exactly, glibc
> manages it OK, so I see no reason why MS can't.
>
> Square roots are exactly-rounded under IEEE arithmetic BTW, as are the usual
> + - * / operators: it's the functions that may return transcendental values
> (cos sin exp, pow) which can never give exact answers purely as a matter of
> principle: although in practice most implementations are last-bit-correct
> for the vast majority of inputs.
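The exact-rounding guarantee for sqrt can actually be checked against a
higher-precision reference. A minimal sketch in Python, using the stdlib
decimal module as the reference (the test inputs are arbitrary, not from the
discussion above):

```python
import math
from decimal import Decimal, getcontext

# math.sqrt must be correctly rounded under IEEE 754; compare it against
# a 50-digit decimal reference rounded back to the nearest double.
getcontext().prec = 50

for x in [2.0, 3.0, 10.0, 0.1, 12345.6789]:
    reference = Decimal(x).sqrt()          # correctly rounded to 50 digits
    assert math.sqrt(x) == float(reference)

# No such guarantee exists for sin/cos/exp/pow: most libms are last-bit-
# correct for most inputs, but the standard does not require it.
```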

In practice, nothing can be assumed to be exactly rounded if it goes through
a decent set of optimizations, since the compiler gets to choose when values
go in and out of 80 bit registers.
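The register-width point is one instance of a broader fact: floating-point
addition is not associative, so any optimization that reassociates operands or
carries intermediates at a different precision can change the result bits. A
tiny illustration with plain IEEE doubles (the constants are chosen for
demonstration, not taken from the thread):

```python
# ulp(1e16) == 2.0, so adding 1.0 to 1e16 is absorbed by rounding;
# the final answer depends entirely on evaluation order.
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c    # 0.0 + 1.0 -> 1.0
right = a + (b + c)   # -1e16 + 1.0 rounds back to -1e16, so -> 0.0

assert left == 1.0
assert right == 0.0

# Under relaxed floating-point settings a compiler may reassociate
# "a + b + c" either way, which is exactly the determinism hazard:
# the source expression no longer pins down the bits of the result.
```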

> > Do you have example code / pointers to documentation for that? I've
> > always been under the impression that basic math is deterministic
> > regardless of
> > IEEE compliance, and would really like to know if/where there are
> > cases where that doesn't hold.
>
> I assume you've read "What Every Computer Scientist Should Know About
> Floating-Point Arithmetic" at
> http://docs.sun.com/source/806-3568/ncg_goldberg.html ?

Actually I have not read it in its entirety, though I will do that shortly.
I have skimmed it and read similar things in the past. However, I couldn't
find any mention of determinism in that document. There are plenty of
discussions of non-portability, but the determinism question is separate.
Specifically, I've been assuming the following:

  If I have a function that accesses no global source of nondeterminism
  (e.g., other global variables, threads, etc.), and I compile it once into a
  separate translation unit from whatever calls it (to avoid inlining or
  other interprocedural weirdness), and call it twice on the same machine
  at different times with exactly the same bits as input, I will get the
  same result.
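
For what it's worth, that assumption is easy to test at the bit level within a
single process; a sketch (f here is a made-up stand-in for the separately
compiled function, not anything from the thread):

```python
import math
import struct

def f(x):
    # Stand-in for a stateless function compiled in its own translation unit.
    return math.sin(x) * math.sqrt(x) + math.exp(-x)

def bits(v):
    # Exact bit pattern of an IEEE double, so "same result" means
    # bit-for-bit identical, not merely == (which would hide -0.0 vs 0.0,
    # and is useless for NaN payloads).
    return struct.pack('<d', v).hex()

x = 0.123456789
assert bits(f(x)) == bits(f(x))  # same machine, same input bits -> same output bits
```

Within one process this will always pass; the interesting failures, if any,
only show up across different builds, optimization levels, or machines, which
a single-process check cannot detect.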

I also usually assume that the compiler is deterministic given the same set
of optimization flags on the same machine with the same environment.

If this assumption is false, it would be great to understand why. If the
answer is contained in that document, I apologize in advance for doubting it.

Thanks,
Geoffrey


Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk