Boost Testing
From: Doug Gregor (dgregor_at_[hidden])
Date: 2005-06-02 12:17:12
On Jun 2, 2005, at 11:21 AM, David Abrahams wrote:
> Pretty nice. Here's the thing: compared to my original suggestion,
> it's a bit hard for me to see if any of my libraries are failing.
Part of your suggestion was for something like this:
    Iterators: 12 regressions, 14 new warnings since release 1.32.0:
      gcc-2.95-3: test1, test2, test3
      msvc-8.0: test7, test9
    ...
    Graph: 3 regressions since release 1.32.0
    Python: 245 regressions since release 1.32.0:
      gcc-2.95-3: test1, test2, test3
      msvc-8.0: test7, test9
This would (I think) have the same problems as the issues list that's
available now. We could add a table of contents at the top of the
page that links down to the set of failing tests for each library.
Then you could skim the table of contents to see if any of your
libraries are failing.
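As a rough illustration of what I mean, a summary page with a linked
table of contents could be generated along these lines. This is just a
hypothetical sketch in Python, not the actual regression report
generator; the library names, toolsets, and tests are made up:

```python
def build_summary(failures):
    """Render a plain-text summary with a table of contents.

    failures: dict mapping library name -> dict mapping toolset ->
    list of failing test names. (Illustrative structure only.)
    """
    toc = ["Contents:"]
    body = []
    for lib in sorted(failures):
        by_toolset = failures[lib]
        # Total regressions for this library across all toolsets.
        total = sum(len(tests) for tests in by_toolset.values())
        toc.append("  %s: %d regressions" % (lib, total))
        body.append("%s: %d regressions since release 1.32.0:" % (lib, total))
        for toolset in sorted(by_toolset):
            body.append("  %s: %s" % (toolset, ", ".join(by_toolset[toolset])))
    return "\n".join(toc + [""] + body)

# Made-up example data.
failures = {
    "Iterators": {"gcc-2.95-3": ["test1", "test2"], "msvc-8.0": ["test7"]},
    "Graph": {"gcc-2.95-3": ["test5"]},
}
print(build_summary(failures))
```

In an HTML version, each table-of-contents entry would be an anchor
link down to that library's section, so a maintainer could skim the
top of the page and jump straight to their library's failures.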
> One other point: when I click through any of the failures, I get "page
> not found."
You also see empty parens ( ) where you should see the name of the
regression test runner. Something is amiss and I'm looking into it as
time permits. Click on the library name to see the full results for the
library.
Doug