
From: Beman Dawes (bdawes_at_[hidden])
Date: 2002-08-05 19:33:44


At 07:19 PM 8/5/2002, Douglas Gregor wrote:

>On Monday 05 August 2002 06:46 pm, Beman Dawes wrote:
>> It would be easy for the program that generates the Status Reporting to
>> look in a file for a list of test/compiler/version[/platform] combinations
>> that need to be reported with some special status. The file could live in
>> CVS, so developers can maintain it for their own tests. IOW, it wouldn't
>> be the responsibility of the people who run the regression tests to figure
>> out when it applied.
>>
>> Brainstorming a bit, let's say this is called the broken-compiler file. If
>> test/compiler/version[/platform] names match the current cell, it will be
>> reported as "Broken" or similar, with a link to a description that says
>> something like:
>>
>> "This compiler fails to comply with the C++ standard in some aspect
>> required to pass this test. The developer knows of no practical
>> workaround. Complain to the compiler vendor!"
>>
>> Is that what you had in mind?
>
>Yes, that's what I had in mind. Perhaps in the .cpp file containing the test
>we could have lines like this:
>
>
>// XFAIL toolsetname1
>// XFAIL toolsetname2
>// ...
>
>The toolset name for each test is already available in the program that
>generates the status reports, right?

Yes. But I'm leery of messing with the .cpp files. My thought is that
centralizing the information makes review by test managers easy.

Another place it could go is embedded in comments in the Jamfile. I'm
already doing that for some other information, so I'm already scanning that
file. That might be a better way.
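To make the Jamfile idea concrete, here is a minimal sketch of how the
report generator might pick up such markers. It assumes a hypothetical
comment format like "# XFAIL toolsetname" (Jam uses "#" for comments);
the function and label names are illustrative, not the actual report
generator's API.

```python
import re

# Hypothetical marker, one per line in a Jamfile comment:
#   # XFAIL <toolset-name>
XFAIL_RE = re.compile(r"^\s*#\s*XFAIL\s+(\S+)")

def expected_failures(jamfile_text):
    """Collect toolset names marked XFAIL in Jamfile comments."""
    toolsets = set()
    for line in jamfile_text.splitlines():
        m = XFAIL_RE.match(line)
        if m:
            toolsets.add(m.group(1))
    return toolsets

def cell_status(passed, toolset, xfail_toolsets):
    """Map a raw test result to a report label for one table cell."""
    if toolset in xfail_toolsets:
        # Known compiler defect: report "Broken" and link to the
        # standard explanation rather than a plain failure.
        return "Broken"
    return "Pass" if passed else "Fail"
```

Because the markers live next to the test's build rules, a test manager can
review them all in one file, which was the point of centralizing the
information in the first place.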

By the way, I've just done a fresh Win32 regression test run. I'll try to
run more than once a day when activity warrants, particularly as we come up
to a release. The diff is against this morning's run.

--Beman


Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk