From: Andy Little (andy_at_[hidden])
Date: 2005-09-13 06:28:36
"Daryle Walker" <darylew_at_[hidden]> wrote
> OK. When you say "arbitrary precision," you mean that a precision limit
> must be set (at run-time) before an operation. Most people use "arbitrary
> precision" to mean unlimited precision, not your "run-time cut-off"
Are there really libraries that have unlimited precision?
What happens when the result of a computation is irrational?
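To illustrate the point: exact ("unlimited precision") arithmetic is possible for the rational operations, but any irrational result forces a precision cut-off. A minimal sketch in Python, using the standard `fractions` module (the helper `sqrt_approx` is my own illustration, not from any library discussed here):

```python
from fractions import Fraction
from math import isqrt

# Exact rational arithmetic: +, -, *, / never need rounding.
a = Fraction(1, 3) + Fraction(1, 6)
assert a == Fraction(1, 2)  # exact, no precision limit involved

def sqrt_approx(n: int, digits: int) -> Fraction:
    """Rational approximation of sqrt(n), good to `digits` decimal digits.

    sqrt(2) is irrational, so NO exact rational (or finite) representation
    exists -- a cut-off (here, `digits`) must be chosen before computing.
    """
    scale = 10 ** digits
    # isqrt gives floor(sqrt(n * scale^2)); dividing by scale yields
    # a rational within 10**-digits of the true square root.
    return Fraction(isqrt(n * scale * scale), scale)

r = sqrt_approx(2, 30)
assert r * r != 2                      # never exact, however many digits
assert abs(r * r - 2) < Fraction(1, 10 ** 29)  # but as close as requested
```

This is why "arbitrary precision" in practice means "any precision you ask for in advance" rather than literally unlimited: the moment a computation produces an irrational value, some finite cut-off has to be imposed.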
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk