Boost Testing:
From: David Abrahams (dave_at_[hidden])
Date: 2007-12-07 21:54:49

Pursuant to the conclusions of
http://lists.boost.org/Archives/boost/2007/08/125724.php that our
reporting system ought to be integrated with Trac, Daniel Wallin and I
have installed a modified Bitten plugin (http://bitten.edgewall.org/)
on the boost-consulting Trac server. We are using it with a few slave
machines to test (a mirror of) the Boost subversion repository.

You can see the results at:
https://boost-consulting.com/trac/projects/boost/build
https://boost-consulting.com/trac/projects/boost/build/trunk
https://boost-consulting.com/trac/projects/boost/build?view=inprogress
and by following the obvious links on those pages.

Some features (I'm probably forgetting something; please poke around and
see what you can discover):
* tests are run for each atomic Boost checkin
* slaves can run incremental or from-scratch tests
* failure markup is processed very quickly in Python on the machine
doing the testing and used to alter the effect of failures, with no
horrendous and time-consuming XSLT step
* no tests are run for libraries that are marked unusable on a given
OS/compiler
* no unreliable jam log processing is involved: Boost.Build directly
generates XML which is processed by Python and shipped up to the Trac
server (see the sketch after this list)
* only failed tests that aren't marked up are shown in most views
* a record of each build run is kept in the Trac server's database
* reorganizing the system so that several machines can contribute to a
single test run with a given OS/toolset is fairly straightforward.
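
To make the slave-side flow concrete, here is a rough sketch of the kind
of result processing described above. It is illustrative only: the element
and attribute names, the markup file layout, and the file names are
assumptions made up for the example, not the actual Boost.Build output
schema or the real failure-markup format.

    # Illustrative sketch: names and layout are assumptions, not the real
    # Boost.Build output schema or failure-markup format.
    import xml.etree.ElementTree as ET

    def load_expected_failures(markup_path):
        """Collect (library, test) pairs whose failures are marked expected."""
        tree = ET.parse(markup_path)
        return set((t.get("library"), t.get("name"))
                   for t in tree.findall(".//mark-expected-failure/test"))

    def unexpected_failures(results_path, expected):
        """Yield failing tests that aren't covered by the markup."""
        tree = ET.parse(results_path)
        for test in tree.findall(".//test-result"):
            key = (test.get("library"), test.get("name"))
            if test.get("status") == "fail" and key not in expected:
                yield key

    expected = load_expected_failures("failure-markup.xml")
    for library, name in unexpected_failures("test-results.xml", expected):
        print("unexpected failure: %s/%s" % (library, name))

Everything above runs on the machine doing the testing, so only the
already-filtered results travel up to the Trac server.
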
There are obviously a few rough edges and things we could do better in
the UI, but I really believe this overall approach is a winner.
Daniel and I are fairly competent Trac and Bitten hackers, so we
should be able to make this system do whatever Boost needs, feed the
improvements back to the Bitten community, and end up with a
maintainable codebase that isn't permanently tied to one or two
people.

It's time to open a discussion of exactly how the system should
evolve, what features we want to see in the UI, etc. I hope those of
you with an interest in the quality of Boost testing tools will join
us on the boost-testing list to discuss this project's future. If
anyone wants to run an additional testing slave, please let me know;
once you have Python on your system, it's extremely easy to set up.

Regards,
Dave

Postscript: Quick Overview of Bitten for Boost people
=====================================================

Bitten is a continuous integration system that's integrated with Trac.
Testing machines (called "slaves") run a Python script (part of the
Bitten distribution) that checks periodically with the server to see
if anything needs to be tested. The server manages a set of testing
"recipes," each of which is associated with a directory in the
subversion repository and a set of slave criteria (such as "are you
running Linux?"). If a recipe has criteria matching the inquiring
slave and there have been changes in the associated part of the
repository, the slave executes the recipe.
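
In rough pseudo-Python, the matching rule the server applies for each
inquiring slave looks something like the sketch below. This is only a
conceptual illustration of the behavior just described, not Bitten's
actual code; all the names are made up.

    # Conceptual sketch of the server-side decision, not Bitten's actual code.
    def pick_recipe(slave_properties, recipes, repository):
        """Return a recipe the inquiring slave should run, or None."""
        for recipe in recipes:
            # Every criterion (e.g. os == "linux") must match the slave.
            if any(slave_properties.get(key) != wanted
                   for key, wanted in recipe.criteria.items()):
                continue
            # There must be new changes under the recipe's directory.
            if repository.youngest_rev(recipe.path) <= recipe.last_tested_rev:
                continue
            return recipe
        return None
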
A recipe is divided into a series of "steps," each of which is
composed of commands
(http://bitten.edgewall.org/wiki/Documentation/commands.html). Slaves
report back to the server as they execute each step, so the server can
tell you about the progress of each slave. If any step fails, the
build fails and the slave stops. Recipes are expressed in XML and
stored in the server's Trac database. Trac users with the right
privileges can create, edit, and delete recipes.
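
As a rough illustration of that structure, the sketch below walks a
made-up recipe and stops at the first failing step, mirroring the
fail-fast behavior described above. The recipe vocabulary here is
invented for the example; the real command elements are the ones in the
Bitten documentation linked earlier.

    # Illustration only: this recipe vocabulary is made up; see the Bitten
    # commands documentation for the real element names.
    import subprocess
    import xml.etree.ElementTree as ET

    RECIPE = """
    <build>
      <step id="bjam" description="Build and run the tests">
        <exec command="bjam toolset=gcc" />
      </step>
      <step id="collect" description="Process and upload the XML results">
        <exec command="python process_results.py bin/results.xml" />
      </step>
    </build>
    """

    def run_recipe(recipe_xml):
        for step in ET.fromstring(recipe_xml).findall("step"):
            print("running step: %s" % step.get("id"))
            for command in step.findall("exec"):
                if subprocess.call(command.get("command"), shell=True) != 0:
                    # A failing step fails the whole build and stops the slave.
                    print("build failed in step: %s" % step.get("id"))
                    return False
        return True

    run_recipe(RECIPE)
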
--
Dave Abrahams
Boost Consulting
http://www.boost-consulting.com