From: Timothy M. Shead (tshead_at_[hidden])
Date: 2007-08-05 12:16:56
David Abrahams wrote:
> However, output serialization is only important because both
> Boost's process_jam_log and ctest/dart currently use a
> parse-the-build-log strategy to assemble the test results. This
> approach is hopelessly fragile in my opinion. Instead, each build
> step in a test should generate an additional target that contains
> the fragment of XML pertaining to that test, and the complete
> result should be an XML file that is the catenation of all the
> fragments. Rene is working on that feature for BBv2. I don't
> expect it would take more than a couple of days to add this feature
> to either build system, so the advantage of serialization may
> be easily neutralized.
There may be some semantics here that I'm missing, but I think ctest is
already doing exactly what you describe:
* ctest checks the process return value of each test. A nonzero value
indicates test failure (unless the test is marked as an expected failure).
* ctest (optionally) parses the test's output (stdout/stderr) with a
configurable regular expression; if the expression matches, the test
fails. The intent is to catch errors reported by a standard logging
mechanism, if one exists, e.g. messages of the form
"CRITICAL ERROR: blah blah blah ...". You can disable this feature if
you don't want it.
* ctest produces an XML file that describes the results of each test,
including pass/fail, execution time, and test output (stdout/stderr).
* ctest also parses the test output (stdout/stderr) for
<NamedMeasurement> tags that are incorporated into the final XML. Tests
can use this mechanism to embed internally generated metrics in their
output in an unambiguous, machine-readable way.
* ctest (optionally) uploads the final concatenated XML to a dart server
where it can be displayed using a web browser.
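The conventions above are easy to satisfy from any language. Here is a
rough sketch of a conforming test executable in Python; the metric name
is made up, and the <NamedMeasurement> spelling follows the description
above (consult the ctest/Dart documentation for the exact tag format it
scans for):

```python
def emit_measurement(name, value):
    # Print a measurement tag on stdout so ctest can fold it into
    # its output XML. Tag spelling is illustrative, per the list above.
    tag = ('<NamedMeasurement type="numeric/double" name="%s">'
           '<Value>%s</Value></NamedMeasurement>' % (name, value))
    print(tag)
    return tag

def main():
    # ... real test logic would go here; we just report a fake metric ...
    passed = True
    emit_measurement("elapsed_seconds", 0.125)  # hypothetical metric
    # Convention: process return value 0 = pass, nonzero = fail.
    return 0 if passed else 1

# A real test executable would finish with sys.exit(exit_status) so that
# ctest sees the return value.
exit_status = main()
```

ctest itself never needs to understand the test's internals; it only
looks at the exit status and scans stdout/stderr for the tags.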
I've attached a very short sample of ctest output XML from a real-world
project. It has been trimmed down to a single test case; normally this
file contains hundreds of tests.
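For readers without the attachment, the per-test record in that XML looks
roughly like the fragment below (element and attribute names are from
memory of contemporary ctest output and may differ in detail):

```xml
<Test Status="passed">
  <Name>mytest</Name>
  <FullCommandLine>/path/to/mytest</FullCommandLine>
  <Results>
    <NamedMeasurement type="numeric/double" name="Execution Time">
      <Value>0.02</Value>
    </NamedMeasurement>
    <NamedMeasurement type="text/string" name="Completion Status">
      <Value>Completed</Value>
    </NamedMeasurement>
    <Measurement>
      <Value>captured test stdout/stderr appears here</Value>
    </Measurement>
  </Results>
</Test>
```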
Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk