
From: Kevin Lynch (krlynch_at_[hidden])
Date: 2001-11-06 18:01:32

Ed Brey wrote:

> 1. Many constants are the return values of functions taking a certain parameter. In a
> perfect world, these wouldn't be constants at all, but would simply be written sqrt(2) or
> gamma(1/3.), or the like.

> 2. To keep at least some uniformity, for each constant that is a function return value,
> the actual function should be available. Sqrt and cos are done for us. But gamma
> constants should not be added without a gamma function.

> 3. Given the above observations, only group 1 (as I've suggested) constants should be
> considered for standardization. The trend should be to move the language away from
> performance hints to compilers (e.g. inline), rather than toward it. Intrinsically
> performing high-quality compile-time square root should become a QoI issue for compilers
> that users can count on, just like what is possible today for operator*(double,double).

I read your post with great interest, and agreed in principle with much
of it... however (going out of order... :-)

There are a number of thorny and subtle issues here, I think, that need
to be considered. While I agree with you that in a perfect world the
math functions would provide some sort of guarantee on their precision,
such guarantees are difficult to obtain in general, since different
algorithms make different precision/run-time trade-offs, perhaps even
for different ranges of the arguments (e.g. algorithm A may be much
faster near 1 than near 2 for equally good precision, while B may
provide equal speed everywhere but be much more accurate near 1 than
near 2). It is often the case that you trade substantial run time for
moderate precision gains, and in that light I don't think the standard
should mandate precision or run-time guarantees. It would be much
better, in my opinion, if the standard required implementations to
document the precision/run-time characteristics of their math functions
in some detail; many standard library implementations are hard to use
in high-performance computing precisely because that information is
missing from the library documentation. (If you don't understand the
library, and a 5% performance penalty is going to cost you two weeks of
run time, you end up paying big bucks for specialized libraries that
give you the information you need, and many of us can't afford that
sort of expense.)

However, if you are only talking about compile-time translation of
functions taking literal arguments, that may be a different story; it
might not be unreasonable to expect compilers (some day) to generate
precise values for expressions that are calculable at compile time.
sqrt(2.) is an excellent example, but what about sqrt(5./13.)? What
about sqrt(3./2.)?
Since sqrt(3./2.) = sqrt(3.)/sqrt(2.), should we expect such
simplifications? How far should the standard go? I like the idea in
concept, but I don't know how far we can expect compiler writers to go
in this direction (even though I'd like to see them go as far as
possible! but then again, I'm no compiler writer :-). After all,
providing much of this optimization would require them to implement
computer algebra systems in their optimizers, and I don't know whether
they'll want to be doing that...

The other issue I thought of relates to your classification of
constants: pi versus sqrt(2.), for example. You stated that you'd like
to see the library, if possible, maintain a "function call interface" to
those things that look like function calls, and provide only those
"constants" that are most "natural as constants". The problem I see is
that this isn't as clear a distinction as you might like: pi = acos(-1),
e = exp(1), i = sqrt(-1), and so on (OK, pi may not pass your
"naturalness" criterion, but exp(1) certainly should). The problem only
gets worse as we consider more constants, and I'm not sure whether there
might be a better definition of the separation. I agree, however, that
there is in principle little "need" to standardize those constants that
are a simple sequence of +-/* away from other constants (if pi/2 loses
substantial precision, then your platform is really broken, and this
library isn't going to help you anyway).

Finally, I disagree that only those constants should be considered
whose defining function is actually available (e.g. sqrt(2) would be
in, but gamma(1./3.) would not). Consider that many of the "missing"
functions will likely be in the TR, since it will likely suck in the
C99 library (cbrt, tgamma, erf, etc. will all be coming in...).

I hope that didn't sound too much like the ramblings of someone
desperately trying to avoid doing more work before leaving for home :-)

Kevin Lynch				voice:	 (617) 353-6065
Physics Department			Fax: (617) 353-6062
Boston University			office:	 PRB-565
590 Commonwealth Ave.			e-mail:	 krlynch_at_[hidden]
Boston, MA 02215 USA
