From: Vladimir Prus (ghost_at_[hidden])
Date: 2002-01-14 10:55:16
David Abrahams wrote:
> > Transformations that I'm talking about are edges in
> > the transformation (or dependency) graph. I don't think that finding the
> > sequence of transformations is in any way different from finding a
> > shortest/unique path.
> There's a big difference between shortest and unique, though ;-).
> > > > With all said, I don't think we should look for shortest path to find
> > > > transformation sequence -- we should look for unique one.
> > >
> > > I'm a little concerned about how that would affect the extensibility of
> > > the system. Here's one simple example: when in "user mode" we might well
> > > want to enable an executable to be generated directly from source files,
> > > without intermediate .obj files when the toolset supports it. It would be
> > > nice if simply enabling those transitions could do the job.
> > At this point we'd need to go back, I guess.
> > 1. What are the semantics of build requests and subvariants? I'm not sure
> > we agree on this point.
> > One alternative is that the build request for a main target specifies all
> > the variants (property-sets) of that target that we want to build.
> > Initially, not all subvariants are known -- e.g. <runtime-link> might not
> > be specified.
> > By providing "active" features it's possible to change the relevance-set &c.
> > Another alternative is that the build request constrains the possible
> > transformations. For example, cpp may be compiled into obj by many
> > toolsets, but <toolset>gcc in the build request would allow only those
> > compilations which use gcc as the toolset.
> > Every allowed transformation path that is found creates a new subvariant.
> I think I understand what you're getting at here. It doesn't sound as
> though it supports feature defaults very well, though. Does this mean that
> if I don't specify <struct-alignment> I end up with a subvariant for all 5
> of its values?
Good point. I definitely don't think a project should start building with
two versions of gcc just because 3.0 was recently added to site-config.jam!
> Does it solve the problem of intermediate target generation? If so, how?
> Can you describe what happens in an example case? What controls when the
> generators fire, and in what order?
Each generator is an edge in a graph whose vertices are target types. If we
put the hierarchy aside, then, for each property set in the expanded build
request, we find the shortest path from the target type of each source to any
of the types that the top-level rule accepts. While searching, we ignore edges
whose requirements are not a subset of the current property set.
There is the question of what happens if a single action generates several
targets. I don't know the right answer yet.
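To make the search concrete, here is a minimal sketch of the idea in Python. All names (generator descriptions, the property strings) are invented for illustration; this is not actual Boost.Build code:

```python
from collections import deque

def find_transformation(generators, source_type, accepted_types, properties):
    """BFS for the shortest generator chain from source_type to any type
    in accepted_types. Edges (generators) whose requirements are not a
    subset of the current property set are ignored."""
    queue = deque([(source_type, [])])
    seen = {source_type}
    while queue:
        current, path = queue.popleft()
        if current in accepted_types:
            return path
        for gen in generators:
            if gen["source"] != current:
                continue
            if not gen["requirements"] <= properties:  # subset check
                continue
            if gen["target"] not in seen:
                seen.add(gen["target"])
                queue.append((gen["target"], path + [gen["name"]]))
    return None  # no allowed transformation path

# Hypothetical generator table: CPP -> OBJ -> EXE, usable only with gcc.
generators = [
    {"name": "gcc.compile", "source": "CPP", "target": "OBJ",
     "requirements": {"<toolset>gcc"}},
    {"name": "gcc.link", "source": "OBJ", "target": "EXE",
     "requirements": {"<toolset>gcc"}},
]
path = find_transformation(generators, "CPP", {"EXE"},
                           {"<toolset>gcc", "<variant>debug"})
# path == ["gcc.compile", "gcc.link"]
```

With a property set that lacks <toolset>gcc, both edges are filtered out and no path exists, which is exactly how the build request constrains the allowed transformations.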
> Conversely, I don't like the idea of generating multiple build variants
> just because the toolsets support a certain kind of variation. Normally, a
> user will just want a release build, and a developer will just want a debug
> build. One build.
I tend to agree now.
> I think it's reasonable to say this:
> * the user gives us a set of property combinations (s)he wants
> * we do some magic to expand the specified properties. This may include:
> * expansion of compound properties
> * executing rules associated with executable properties
> * There is a procedure for running generators based on each property set.
> This procedure should support a kind of overload resolution which allows us
> to control precedence based on specificity. This part is complicated and
> I'd rather have a simpler alternative, but I don't see one now.
> * generators are run to build the dependency graph
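The "expansion of compound properties" step quoted above might be sketched roughly as follows. The compound mapping here is invented for illustration, not the real feature definitions:

```python
# Hypothetical mapping: a compound property stands for a set of base
# properties that it expands to.
COMPOUND = {
    "<variant>debug": {"<optimization>off", "<debug-symbols>on"},
    "<variant>release": {"<optimization>speed", "<debug-symbols>off"},
}

def expand_properties(requested):
    """Replace each compound property with its base properties;
    non-compound properties pass through unchanged."""
    expanded = set()
    for prop in requested:
        expanded |= COMPOUND.get(prop, {prop})
    return expanded

expanded = expand_properties({"<variant>debug", "<toolset>gcc"})
# expanded now contains <optimization>off, <debug-symbols>on, <toolset>gcc
```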
What bothers me about your suggestion is that it seems to attempt to guess
all the build variants by looking only at the top-level target, before finding
the transformations. How can we account for the fact that some intermediate
transformation depends on the value of some feature? How can we handle the
possibility that intermediate targets have fewer relevant properties than the
top-level target? I don't necessarily mean that your approach precludes these,
but I would like to see a way to address those questions.
> > ii) Let's start with the following question: is it okay to have a
> > semantics where:
> > - each derived target type initially inherits all the transformations
> > that its parent has
> What does it mean for a target type to "have" a transformation? Can you
> make this concrete for me?
It means that the transformation has that type as either its source or its
target type. Lately I have considered the possibility of first looking for
paths from source to target types, and then (if no path is found) walking up
and down the hierarchy, searching for paths between target types which are not
an exact match. We'd need to discuss your proposal before turning to this one,
though (I find it slightly more complicated than it should be).
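The fallback could look something like this sketch. The type hierarchy and the helper names are hypothetical, and the path finder is a stand-in for the real search:

```python
# Hypothetical derivation hierarchy: each type maps to its base type
# (None for a root type). RC_OBJ is an invented type derived from OBJ.
BASE = {"RC_OBJ": "OBJ", "OBJ": None, "CPP": None, "EXE": None}

def candidates(type_name):
    """Yield type_name, then each of its ancestors in turn."""
    while type_name is not None:
        yield type_name
        type_name = BASE.get(type_name)

def find_with_hierarchy(find_path, source_type, target_type):
    """Try exact types first, then walk up the hierarchy on either side,
    returning the first path found between any (source, target) pair."""
    for src in candidates(source_type):
        for dst in candidates(target_type):
            path = find_path(src, dst)
            if path is not None:
                return path
    return None

def toy_find_path(src, dst):
    # Toy path finder: only knows how to link OBJ into EXE.
    return ["link"] if (src, dst) == ("OBJ", "EXE") else None

# No exact RC_OBJ -> EXE path exists, so the search falls back to the
# inherited OBJ -> EXE transformation.
result = find_with_hierarchy(toy_find_path, "RC_OBJ", "EXE")
# result == ["link"]
```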
Boost-Build list run by bdawes at acm.org, david.abrahams at rcn.com, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk