From: Beman Dawes (bdawes_at_[hidden])
Date: 2002-07-08 15:56:29
We've been working for a long time to improve Boost regression testing.
Step one was to move the regression tests over to Jam. Thanks to Dave,
Rene, and the other folks working on the Boost build system, that's been
working for quite a while now.
Step two is generating the HTML status table for the regression
tests. I've been working on that; it has taken a long time because it is
being done in C++, which first required developing a filesystem library.
Some of the objectives for the status tables were to make them easier to
use. Specifically, the format of rows has been changed to provide links to
the documentation for each library being tested, to the .cpp file being
tested, and to the specific reasons for failed tests.
A sample table is available at http://www.esva.net/~beman/jam_regr.html
You can view it there directly, but for the library and test file links to
work, you will have to download it to your boost-root/status directory,
along with http://www.esva.net/~beman/jam_log.html
Notice that some of the "Fail" links aren't working yet; the first two,
for example, don't work. I'll try to fix the link failures over time. Any
suggestions for understanding what is happening in those cases would be
appreciated. The raw jam output log is available at:
It seems to me that jam logs (whether raw or processed into HTML) would be
more readable if we cut down on the maximum number of errors. With
Metrowerks (the compiler being used for the tests) the current toolset sets
the maximum number of errors at 20. I'd like to see that cut down to 10 or
even 5. What do others think? Is this something we have to work out for
each compiler, or can we pick some number to use unless experience dictates
otherwise?
Any comments would be appreciated.
The program which creates the status HTML is available at:
The program which creates the log HTML is available at:
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk