From: Vladimir Prus (ghost_at_[hidden])
Date: 2002-03-27 04:07:12
Steven Knight wrote:
> > > In the SCons testing infrastructure, all tests are self-contained
> > > Python scripts that execute tests in one or more temporary directories.
> > > Any necessary files are created from in-line Python strings.
> > I like the idea of using actual directory layout for specifying tree for
> > a simple reason. I can just create the tree on disk, play with it, and
> > the convert it into test with no effort. Or later, I can easily play with
> > the tree that a test uses, in case the test fails.
> That's a definite plus for many people; YMMV. The in-line requirement
> for our tests stems mostly from our use of the Aegis change management
> system, which works a lot better when tests are self-contained.
> The advantage I've found is that it makes the tests atomic. You don't
> have to worry about failures because someone forgot to list a file, or
> the state of the tree hasn't been re-set properly. Again, YMMV.
So far, I lean towards allowing real directory trees to be used as data for tests.
> > My second thought is: does your
> > code allow detecting when a file was added/removed
> That would typically be handled by something like:
> test.fail_test(not os.path.exists(test.workpath('should_exist')))
But... this will detect whether a file is present, not whether it was *added*.
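To illustrate: detecting *additions* really requires comparing snapshots of the tree taken before and after the run. A minimal sketch in plain Python (the names here are mine, not TestCmd's):

```python
import os
import tempfile

def snapshot(root):
    """Return the set of all file paths under root, relative to root."""
    paths = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            paths.add(os.path.relpath(os.path.join(dirpath, name), root))
    return paths

# Snapshot, run the "build" (here just a stand-in step), snapshot again:
work = tempfile.mkdtemp()
before = snapshot(work)
open(os.path.join(work, 'generated.o'), 'w').close()   # stand-in build step
after = snapshot(work)

added = after - before        # files that appeared: {'generated.o'}
removed = before - after      # files that disappeared: empty here
```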
> > and, more importantly, when
> > it was touched/modified. I'm not sure this is straightforward -- an attempt
> > to open a nonexistent file will just raise IOError,
> So far we've used os.path.getmtime() successfully:
> oldtime = os.path.getmtime('foo')
> test.fail_test(oldtime != os.path.getmtime('foo'))
Not quite that simple: you need a call to test.workpath in this case as well :-)
True, this works, but I'd rather have a separate method for it.
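For instance, such a separate method might look like this (a sketch only; workpath/fail_test mimic the TestCmd.py calls above, while remember_mtime/expect_untouched are names I'm making up):

```python
import os
import tempfile

class BuildTest:
    """Minimal sketch of the helper I have in mind. workpath() and
    fail_test() mirror TestCmd.py; the mtime methods are hypothetical."""

    def __init__(self, workdir):
        self.workdir = workdir
        self._mtimes = {}

    def workpath(self, *parts):
        # Map a test-relative name to its path in the temporary area.
        return os.path.join(self.workdir, *parts)

    def fail_test(self, condition):
        if condition:
            raise AssertionError("test failed")

    def remember_mtime(self, name):
        # Record the file's mtime before the build system runs.
        self._mtimes[name] = os.path.getmtime(self.workpath(name))

    def expect_untouched(self, name):
        # Fail if the file's mtime changed since remember_mtime().
        self.fail_test(
            self._mtimes[name] != os.path.getmtime(self.workpath(name)))

# Demo: a file that the "build" does not touch passes the check.
t = BuildTest(tempfile.mkdtemp())
open(t.workpath('foo'), 'w').close()
t.remember_mtime('foo')
t.expect_untouched('foo')     # mtime unchanged, so no failure
```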
> Note that you can run() an arbitrary program by using the appropriate
> keyword argument.
> TestCmd.py also supports use of regular expression matches on output,
> instead of exact matches, which comes in handy when we're checking
> output where the line numbers vary, for example.
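For reference, in plain `re` terms such a check amounts to something like this (a sketch; TestCmd's own helpers may spell it differently):

```python
import re

# Output from a hypothetical compiler run; the line number will vary:
output = "main.cpp:17: warning: unused variable 'x'"

# The expected text, with a regular expression where the number varies:
expected = r"main\.cpp:\d+: warning: unused variable 'x'$"

# The test passes as long as the output matches the pattern.
assert re.match(expected, output)
```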
These are good things. I'm currently leaning towards the following solution:
I'll take the SCons code and then add a facility for remembering and
comparing the complete state of the tree before and after a build system
invocation (so, for example, there will be no need for getmtime calls).
There's a question of whether it's worth the trouble -- for me, the fact
that I'll reuse existing code settles it.
In detail, I will:
1. Add a method to copy an existing tree to the temporary area
2. Add a method "run_build_system" which will
   a) call the 'run' method
   b) traverse the temporary area and create an in-memory tree
   c) compare that tree representation with the previous one, to
      find additions/touches etc.
3. Add a number of methods which test the tree differences against
   the expected ones.
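Such methods might look roughly like this (a sketch in plain Python of steps 2(b), 2(c) and 3; all the names are hypothetical, not existing TestCmd or Subversion code):

```python
import os
import tempfile

def tree_state(root):
    """Step 2(b): in-memory snapshot of a tree, relative path -> mtime."""
    state = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            state[os.path.relpath(full, root)] = os.path.getmtime(full)
    return state

def tree_difference(before, after):
    """Step 2(c): classify the changes between two snapshots."""
    added   = sorted(set(after) - set(before))
    removed = sorted(set(before) - set(after))
    touched = sorted(p for p in after
                     if p in before and after[p] != before[p])
    return added, removed, touched

def expect_addition(before, after, names):
    """Step 3: fail unless exactly `names` were added and nothing else changed."""
    added, removed, touched = tree_difference(before, after)
    assert added == sorted(names) and not removed and not touched

# Demo: one source file exists, the "build" produces one object file.
work = tempfile.mkdtemp()
open(os.path.join(work, 'a.cpp'), 'w').close()
before = tree_state(work)
open(os.path.join(work, 'a.o'), 'w').close()   # stand-in for the build
after = tree_state(work)
expect_addition(before, after, ['a.o'])
```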
I should note that this would make direct filesystem operations in tests bad
style, since they bypass the primary mechanism, but I think that is OK.
> > > The feedback from the developers has been good. They've all found this
> > > framework makes writing SCons tests easy enough that they all include
> > > new or modified tests in their patches; I haven't had to crack down or
> > > bug anyone to provide tests. (Of course, it helps that they *do* know
> > > that tests are expected if they want their patch integrated... :-)
> > Sure :-) Actually, it should be noted that the code which actually senses
> > which files are added/changed etc. is "stolen" from the SCM tool Subversion
> > (http://subversion.tigris.org), where it's used quite actively to write
> > tests, and it seems the developers don't object either.
> Thanks for the pointer. I've heard good things about Subversion, but
> haven't had a reason yet to look closely. Is this within Subversion
> itself, or is it a separate testing infrastructure that Subversion uses?
The latter -- Subversion, unlike Aegis, focuses solely on version control.
Boost-Build list run by bdawes at acm.org, david.abrahams at rcn.com, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk