Boost-Build :
From: Steven Knight (knight_at_[hidden])
Date: 2003-06-16 14:38:07
> >> >> 1. Features. Those are quite important and nifty things. I believe Scons is
> >> >> using Environment for the same purpose?
> >> >
> >> > Yes (based on my quick scan through the documentation). An
> >> > Environment is where you set up how you want one or more products to
> >> > be built: use *this* compiler, *this* version of yacc, these flags,
> >> > these include paths, these libraries, etc...
> >>
> >> I always imagined that many features might end up being translated
> >> into Environment settings, but I guess another possibility is that we
> >> just bypass the Environment so that its "smarts" don't get in the way
> >> :(.
> >
> > Hmm, maybe I gave you the wrong impression. Environments are actually
> > pretty dumb, they're basically just dictionaries of values that get
> > plugged in to how you build things. They're also *the* way to interact
> > with the SCons build engine.
>
> I'm sorry, I guess I meant the smarts of the things that turn
> Environment settings into command-lines.
Even that's pretty dumb, basically just substituting variables. The
smarts that *are* there are pretty configurable, though, because you can
call callables from within our variable substitution logic. So, for
example, here is the value for $_CPPINCFLAGS, which is the expansion
that turns $CPPPATH (the list of directories that get searched for
#include dependencies) into the right -I options:
'$( ${_concat(INCPREFIX, CPPPATH, INCSUFFIX, __env__, RDirs)} $)'
The bracketing $( and $) tell SCons to ignore those options for signature
calculation, so that changing -I options doesn't cause an automatic
recompile. (The recompile, if necessary, will occur because changing
-I options may cause you to #include a different file than you did
previously, which is detected.)
The ${ and } bracket the expansion, which gets eval'ed, so an expansion
can actually contain (somewhat) arbitrary Python code.
_concat() is our internal function that does the heavy lifting,
although it's pretty simple: it just takes $INCPREFIX (configurable;
/I on Windows, -I on POSIX by default) and $INCSUFFIX ('' by default,
sometimes necessary to set it to '/') and prepends and appends them,
respectively, to every directory listed in $CPPPATH.
__env__ is the current environment, in case the function needs to fetch
additional values. RDirs is a function that _concat() runs the path
names through so that the resulting -I options work with our Repository
logic.
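To make the mechanism concrete, here is a minimal sketch of the prefix/suffix part of what _concat() does. This is a simplified, hypothetical signature: the real function also threads directories through RDirs and takes the environment as an argument.

```python
# Hypothetical, simplified sketch of SCons's internal _concat():
# wrap each include directory in a prefix/suffix pair, producing the
# option strings that end up on the compiler command line.
def concat(prefix, paths, suffix):
    """Return e.g. ['-Iinclude', '-Isrc'] for prefix='-I', suffix=''."""
    return [prefix + p + suffix for p in paths]

# POSIX-style defaults described above: INCPREFIX='-I', INCSUFFIX=''
print(concat('-I', ['include', 'src'], ''))   # ['-Iinclude', '-Isrc']

# Windows-style: INCPREFIX='/I'
print(concat('/I', ['include'], ''))          # ['/Iinclude']
```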
> >> If we want people who specify features to have a uniform way to
> >> express them, and if we don't think the Environment is going to cover
> >> all of our needs, we may have to do that. I'd rather that we're all
> >> able to capitalize on one-another's knowledge of tools and platforms,
> >> though.
> >
> > I think that's covered. The tools that we support are each in a module
> > that contain the information about how that tool needs to be built.
>
> I'm not talking about how tools are built at this point, only how they
> are invoked.
Sorry, I mistyped. I meant how they get invoked to build other
things.
> > What we don't do right now is tie the tools in different tool chains
> > together as tightly as I'd like. It's *theoretically* possible, for
> > example, that a given build run will configure the MinGW compiler and
> > the Visual Studio linker in the same Environment. In practice, it's not
> > a problem because if your PATH finds the MinGW compiler first, it'd be
> > really, really weird for it to not find the corresponding linker first,
> > too...
>
> We don't like the idea of relying on that sort of thing. In fact, we
> allow/encourage building with multiple toolchains in a single
> invocation.
It's not invocation by invocation, but environment by environment. You
can build as many different variants using as many different tool
chains in a single invocation as you'd like.
What I was referring to was the (potentially) dangerous practice of
linking objects compiled with one tool chain with the linker from
another tool chain. Right now, it's possible to mix and match in ways
that let you shoot yourself in the foot, although you kind of have to
try to do that, so as I said, it hasn't been a pressing problem to fix.
> > So you can basically hook up Builders arbitrarily using 'src_builder',
> > and when env.Program() is invoked, we walk back through the list of
> > src_builders until we find a chain that leads back to the specified
> > source suffixes. So you end up just listing the input source files:
> >
> > env.Program('foo', ['f1.o', 'f2.c', 'f3.y', 'f4.s'])
> >
> > And the build engine works out the internal details based on how the
> > builders are configured.
>
> There are real scenarios where that procedure will find a suboptimal
> dependency/transformation graph. The search prototype I pointed you
> at is designed to find the graph which requires the minimal number of
> transformations: you dump all the allowed transformations into a soup
> and it just figures it out (and tells you if there's ambiguity).
That sounds like it could be very valuable for us to incorporate.
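For the record, the kind of search described above can be sketched as a breadth-first walk over the "soup" of allowed transformations, finding the chain with the fewest steps and flagging ambiguity. This is a hypothetical illustration, not code from either tool; the rule names are made up.

```python
from collections import deque

# Hypothetical sketch: rules are (source-suffix, target-suffix) pairs,
# e.g. yacc turns .y into .c, the compiler turns .c into .o, the linker
# turns .o into an executable. BFS finds the shortest transformation
# chain and reports an error if two distinct minimal chains exist.
def find_chain(rules, start, goal):
    queue = deque([(start, [start])])
    found = []        # all minimal-length chains reaching the goal
    best_len = None
    while queue:
        node, path = queue.popleft()
        # BFS visits paths in nondecreasing length, so once we see a
        # longer path than the best solution, we can stop.
        if best_len is not None and len(path) > best_len:
            break
        if node == goal:
            best_len = len(path)
            found.append(path)
            continue
        for src, dst in rules:
            if src == node and dst not in path:
                queue.append((dst, path + [dst]))
    if len(found) > 1:
        raise ValueError("ambiguous transformation chains: %r" % found)
    return found[0] if found else None

rules = [('.y', '.c'), ('.l', '.c'), ('.c', '.o'), ('.s', '.o'), ('.o', 'exe')]
print(find_chain(rules, '.y', 'exe'))   # ['.y', '.c', '.o', 'exe']
```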
> >> That was just an example. There are lots of other common options,
> >> such as enabling/disabling debug symbols, optimizations, ... I think
> >> Volodya's question is whether there's a general framework for handling
> >> these things.
> >
> > Yes, there is a framework for this.
> >
> > We have a separate module for each of the tools that we support,
> > each module with two interface functions: one searches for the tool
> > and returns a value that says, "Yes, they have compiler X installed
> > in a PATH that this Environment can get to;" the other actually
> > initializes an Environment with all of the appropriate values so that
> > the Environment can use the tool, creating any necessary Builders or
> > doing anything else that's required to set up things properly.
>
> Now you're talking about tool setup and configuration, which is a
> separate topic. I was trying to describe the translation of abstract
> concepts like "debugging enabled" into command-line options like "-g".
This has come up periodically, and I've always shied away from it.
The slippery slope that I always see here is: how do you accommodate
differences in levels of debugging (or optimization, or...)? There's
not necessarily any equivalence of debugging levels between different
versions of the same tool, let alone different tools. An optimization
level that works fine for *my* code might break your code using version
2.95 but be all right for you in 2.96--and all of a sudden you have
to break open your tidy abstraction and get grungy with versions and
options.
In other words, I think any sort of abstraction like this is, at some
level, inherently tied to the code you're compiling with it. If your
underlying product is code, well, maybe that makes sense. Now, since
Boost is about delivering code, and (as I understand it) delivering
reliable library behavior across platforms and tool sets, I can see how
it might make sense for you to bite off on this sort of abstraction for
your product.
So my initial inclination is that this capability would stay specific to
your interface, but, hey, if you have a framework that really looks like
it works in the general case, I might be convinced... :-)
> OK, that's nice. A much more important issue for Boost would be "can
> a build system be packaged in such a way that someone installing (or
> building it from scratch) doesn't get the sense he's installing
> Python. Lord knows why; I guess some companies are regressive in
> that way.
>
> Honestly, I'm not sure it's so important: if we could just provide
> prebuilt executables for a few major platforms I bet we could get
> away with telling everyone else to install Python.
As I think I mentioned, prebuilt executables are on the agenda, but
haven't been high enough priority yet.
> > Thanks for the interest. This sounds like it could be really cool to
> > bring this stuff together. What's prompting the move in this direction,
> > anyway?
>
> No matter what we do to it, the Jam language still has limits in its
> expressiveness, limits in speed due to representing everything as
> lists of strings, etc. We have added one advantage that Python
> doesn't have: optional type declarations. The core build engine of
> Jam that we inherited from Perforce, its dependency evaluator, etc.,
> is a pile of twisty 'C' code. From watching bug reports on their
> list, Perforce doesn't seem to have regression/unit tests for their
> code so we're reluctant to try to keep tracking Perforce's codebase.
> We have evolved the core of Boost.Jam sufficiently far from what
> Perforce is doing that it's difficult to gain much advantage from the
> existence of another "supported" tool,...
>
> ...get the picture?
And how. Been there, done that; didn't even get a T-shirt... :-)
--SK
Boost-Build list run by bdawes at acm.org, david.abrahams at rcn.com, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk