Boost :
From: Aleksey Gurtovoy (agurtovoy_at_[hidden])
Date: 2006-07-31 20:19:45
Gennaro Prota writes:
> On Mon, 31 Jul 2006 14:11:58 -0500, Aleksey Gurtovoy
> <agurtovoy_at_[hidden]> wrote:
>
>>Gennaro Prota writes:
>>> About the regression test mess, dynamic_bitset is not listed in the
>>> report, but has some "red" failures
>>
>>A "red" failure indicates a regression from the last-known-good
>>release, whether the toolset is marked as required or not, so it's
>>perfectly normal to have these present on the library's page and
>>absent from the Issues page.
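For illustration only, a minimal sketch of that classification rule as described above (the struct and field names are hypothetical; this is not the actual report-generator code):

  #include <iostream>

  struct TestResult {
      bool passed_now;              // result in the current run
      bool passed_in_last_release;  // result in the last-known-good release
      bool toolset_is_required;     // only required toolsets feed the Issues page
  };

  // "Red" on the library page: a regression from the last-known-good
  // release, regardless of whether the toolset is marked as required.
  bool is_red(const TestResult& r) {
      return !r.passed_now && r.passed_in_last_release;
  }

  // Listed on the Issues page: only regressions on required toolsets,
  // so a red cell on a non-required toolset never shows up there.
  bool on_issues_page(const TestResult& r) {
      return is_red(r) && r.toolset_is_required;
  }

  int main() {
      TestResult example{false, true, false};  // hypothetical non-required toolset
      std::cout << std::boolalpha
                << "red: " << is_red(example)
                << ", on Issues page: " << on_issues_page(example) << '\n';
  }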
>
> Ok. I realized this afterwards.
>
>>> on compilers which I don't think were tested before (which I've
>>> discovered by chance);
>>
>>Which compilers, and why do you think they weren't tested before?
>
> Not sure, but I don't remember dynamic_bitset<> being regression
> tested on CW 8.3.
Well, it was:
http://engineering.meta-comm.com/boost-regression/1_33_1/developer/dynamic_bitset_release.html
>>> and cases where the same compiler passes or not depending on the
>>> test runner (e.g. VC7).
>>
>>That happens, and when it does, it needs investigation.
>
> Aleksey, regression tests really need some attention. I understand
> that they are provided as a courtesy, that they don't come for free
> and all that, but they definitely need attention.
Surely there is a lot of room for improvement (as illustrated by
http://www.crystalclearsoftware.com/cgi-bin/boost_wiki/wiki.pl?Boost.Testing),
but the above sounds like you think they are neglected, and if so, you
should list the particular issues that led you to that conclusion;
otherwise it's FUD. For instance, surely you could have checked the
1.33.1 report pages before claiming that dynamic_bitset wasn't tested
with CW 8.3 (or any other toolset)?
> I've just fixed a hardcoded cvs repository link which still referred
> to the old cvsroot,
That's much appreciated.
> so that no link to source files actually
> worked. That's alarming, and it's not the test runner's fault: it
> either means no one follows the links or they take for granted that
> they do not work.
The former.
> Both are bad signals, IMHO.
I don't think it's that bad, really: library maintainers don't follow
the links because they know what their tests are and have a working
copy anyway, and most users don't follow them because they are not
interested in that level of detail (or maybe they don't know where
to report a broken link, which is a problem, but hardly an indication
of the regression tests being neglected).
--
Aleksey Gurtovoy
MetaCommunications Engineering