
From: Fernando Cacciola (fcacciola_at_[hidden])
Date: 2001-12-06 12:03:00


----- Original Message -----
From: Ullrich Koethe <u.koethe_at_[hidden]>
To: <boost_at_[hidden]>
Sent: Thursday, December 06, 2001 1:36 PM
Subject: Re: [boost] Formal review: New Boost.Test Library

> Fernando Cacciola wrote:
> >
> > ) I don't like the names of the macros 'BOOST_WARN_MESSAGE',
> > 'BOOST_CHECK_MESSAGE', and 'BOOST_REQUIRE_MESSAGE'; they don't do what
> > they are semantically implying. That is, 'check message' implies 'check
> > *the* message', not 'check *this* and eventually show a message'. I
> > suggest... something else :) [just couldn't come up with something]
> >
>
> Since I might be the one who once invented these names: I also didn't
> like them very much, but couldn't find any better ones either. So,
> suggestions please...
>
I'll try to think of something...

> > ) I *strongly* disagree with having floating point and collection
> > equality tools as part of the test library. They are orthogonal to the
> > unit test framework. I would agree to provide, in the proper place, a
> > floating point comparator and a sequence matcher, but they should be
> > outside the test suite.
> >
>
> I *strongly* disagree with your opinion. Perhaps you are right from the
> standpoint of purity and beauty, but in practice floating point and
> collection equality are among the most frequent tests. So, until someone
> implements the external module you are proposing, they *belong* to the
> unit test framework. One can still refactor this later.
>
I'm not saying that we shouldn't provide facilities to test floating point
and collection equality.
Nor am I saying that the Unit Test Library distribution shouldn't come with
these facilities until they are factored out.
What I'm saying is that I would prefer these facilities to be decoupled.

Specifically:

Currently we have:

#define BOOST_CHECK_CLOSE(left, right, tolerance_or_number)
#define BOOST_CHECK_WEAK_CLOSE(left, right, tolerance_or_number)

The test is too deeply embedded in the test framework.
What if I need a different comparison scheme?
What if the comparison scheme I need is so different that even the third
argument is irrelevant or not enough?

What I suggest is to provide something like:

template<class T> bool fp_equal(T const& l, T const& r);
template<class T> bool fp_equal2(T const& l, T const& r, T const& tol);
template<class T> bool fp_equal3(T const& l, T const& r, T const& tol);

(and something similar for sequences),

and put this in a separate header, unrelated to any test stuff.
In the meantime, we can distribute this header along with the test library
so that the framework can be used right out of the box.
But there is no gain in conceptually coupling these things.
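For illustration, here is a minimal sketch of what such a decoupled header might contain. The names and the relative-tolerance semantics are my own assumptions for the sake of the example, not an existing API:

```cpp
#include <algorithm>
#include <cmath>
#include <limits>

// Sketch only: hypothetical decoupled comparators, independent of any
// test framework. Names and semantics are illustrative assumptions.

// Relative comparison using machine epsilon as the tolerance.
template <class T>
bool fp_equal(T const& l, T const& r)
{
    T scale = std::max(std::abs(l), std::abs(r));
    return std::abs(l - r) <= scale * std::numeric_limits<T>::epsilon();
}

// Relative comparison against a user-supplied tolerance.
template <class T>
bool fp_equal2(T const& l, T const& r, T const& tol)
{
    T scale = std::max(std::abs(l), std::abs(r));
    return std::abs(l - r) <= scale * tol;
}
```

A sequence matcher could then be layered on top by applying one of these predicates element by element, still without any dependency on the test framework.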

> [As an aside: it would be really helpful for testing if there were some
> support for determining the right tolerance for floating point
> comparisons automatically on every platform, depending on the expected
> error of the computation. Would this be part of your module?]
>
AFAIK, there isn't any single right tolerance.
It always depends on the input and the sequence of operations, besides the
platform and the required error bound.
The best thing we -as library providers- can do is to provide a set of
different schemes, some with fixed epsilon-based tolerances and some with
user-provided tolerances, properly documented. Users would then have to
choose the right one.
Gennady presented a starting point along these lines, but I haven't finished
evaluating it.
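To make the "expected error of the computation" idea concrete, one such documented scheme might scale machine epsilon by the number of rounding operations in the computation. This rule of thumb, and the helper names below, are only an illustration of one possible scheme, not a recommendation:

```cpp
#include <algorithm>
#include <cmath>
#include <limits>

// Hypothetical scheme: derive a relative tolerance from n, the number of
// rounding operations in the computation. n * epsilon is a common rule of
// thumb for worst-case rounding-error growth, not a guarantee.
template <class T>
T rounding_tolerance(unsigned n)
{
    return static_cast<T>(n) * std::numeric_limits<T>::epsilon();
}

// Relative comparison against that derived tolerance.
template <class T>
bool close_within(T l, T r, T tol)
{
    T scale = std::max(std::abs(l), std::abs(r));
    return std::abs(l - r) <= scale * tol;
}
```

For example, a sum of 1000 terms would be checked with rounding_tolerance<double>(1000); a user who knows a tighter error bound for their algorithm would pick a different scheme.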

Fernando Cacciola
Sierra s.r.l.
fcacciola_at_[hidden]
www.gosierra.com


Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk