
Subject: Re: [Boost-build] RFC: Boost.Build Python Prototype
From: Stefan Seefeld (stefan_at_[hidden])
Date: 2016-11-15 15:07:34


On 15.11.2016 14:49, Vladimir Prus wrote:
> Stefan,
>
> On 15-Nov-16 6:45 PM, Stefan Seefeld wrote:
>
>>> Also, "b3 --tool=gcc:gcc & b3 --tool=cc:clang" might work if the
>>> decision what variants to build is done by user. In case of Boost,
>>> the decision which variants to build is done inside top-level Jamroot,
>>> using various properties include the toolset. I don't suppose it
>>> would be good if users were required to use non-portable shell
>>> scripting
>>> for that.
>>
>> I don't understand the point you are making. The question is not what
>> syntax to use to convey the request to build two variants; the question
>> is where the decision is made.
>> What I understand from your description is that it's not always clear or
>> even possible to define multiple build variants to capture the building
>> of two versions of a library (for example), so it has to be possible to
>> build both in the same build process.
>
> I was trying to say that for Boost, the top level decides which variants
> to build, using not exactly trivial logic, so having functionality
> to generate one metatarget a few times is much better than the
> 'just run b2 a few times' approach that you seem to have suggested.

OK, I now understand (and agree).

>>
>> So something like
>>
>> lib1 = library('mylib', sources=[...], link='static')
>> lib2 = library('mylib', sources=[...], link='shared')
>
> The duplication of sources here would violate the DRY principle.
>
>> might work. Or perhaps even
>>
>> meta = library('mylib', sources=[...])
>> lib1 = meta.expand(link='static')
>> lib2 = meta.expand(link='shared')
>
> That is closer to both what I think the requirements are and what
> Boost.Build does. However, in Boost.Build the declaration of the
> metatarget and the code that requests its build need not be
> together, so you'd have to do something like:
>
> meta = lookup_metatarget('subdir/mylib')
> for props in [{'link': 'static'}, {'link': 'shared'}]:
>     meta.expand(**props)

Excellent. I think that's quite close to what I already have. (There is
no 'lookup_metatarget' function, as I pass target objects around. But
adding a mechanism to look up objects by (qualified) name is trivial
enough.)
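
For illustration, the lookup could be as simple as a registry keyed by
qualified target name. This is only a sketch; neither 'register' nor
'lookup' exist in the prototype yet:

# illustrative registry keyed by qualified target name
_targets = {}

def register(name, target):
    _targets[name] = target

def lookup(name):
    return _targets[name]

# expand the same metatarget once per requested property set
meta = lookup('subdir/mylib')
for props in [{'link': 'static'}, {'link': 'shared'}]:
    meta.expand(**props)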

>>>>> - You also seem to not have any notion of portable build properties -
>>>>> instead each tool just accepts whatever it pleases. I can't see any
>>>>> support for automatically generating build directories based on
>>>>> properties. There does not seem to be any support for computing
>>>>> final
>>>>> build properties based on requested ones.
>>>>
>>>> You are right, that is still missing. Indeed, my putting together the
>>>> mechanism for communicating properties across targets felt quite
>>>> ad-hoc,
>>>> and I would like to consolidate that into something more formal. I'd
>>>> appreciate your help with that ! :-)
>>>> And for avoidance of doubt: I'm not at all against establishing build
>>>> directory naming conventions based on properties. I just don't want to
>>>> bake that into the tool itself, i.e. I'd rather make that something
>>>> that
>>>> can be "plugged in" per project.
>>>
>>> It is fine to customize that, but I think build variants that work out
>>> of the box are also a key part of Boost.Build's appeal. Users should
>>> not have to configure this manually.
>>
>> I agree. But from b2p2's point of view everything above (including Boost
>> itself) is a 'user'. (While I haven't introduced a per-project config
>> file yet, I have been thinking about it. So that would be a good place
>> to define things such as build directory layout, default build variant,
>> etc.)
>
> So if one starts from scratch, there's no sensible build directory layout?

Right, because what is sensible depends a lot on context that a build
system will not have, at least not if it strives to be general-purpose.
But again, if we add a "customization layer" to reduce the level of
genericity, we can easily add such policies.
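
To make that a bit more concrete: such a policy could be as small as a
per-project callable that maps bound properties to a directory. A
minimal sketch only; the installation hook named in the comment is
hypothetical and nothing like it exists yet:

import os.path

def build_dir(props):
    # e.g. {'toolset': 'gcc', 'link': 'shared'} -> 'build/gcc/shared'
    return os.path.join('build',
                        props.get('toolset', 'unknown'),
                        props.get('link', 'shared'))

# a per-project config file would then install this policy, e.g.
# project.set_build_dir_policy(build_dir)   (hypothetical name)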

(This is also in part why I haven't established a pre-defined vocabulary
for features - a point you were critical of in a previous mail. I'm
all for adding a layer where the same "features" as are currently used
in b2 are defined. But right now I don't believe these belong in the
foundational layer that my prototype has been focusing on so far.)
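
Such a layer could, for example, pre-declare the familiar b2 feature
vocabulary on top of the generic property mechanism. Again only a
sketch, with an assumed 'feature' helper that is not part of the
prototype:

# assumed helper: declare a named feature with its allowed values
def feature(name, values, default=None):
    return {'name': name,
            'values': values,
            'default': default if default is not None else values[0]}

# a vocabulary layer could then define the usual b2 features:
link = feature('link', ['shared', 'static'])
variant = feature('variant', ['debug', 'release'])
threading = feature('threading', ['multi', 'single'])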

> Technically, there are tools that come with project generators
> to fill in various defaults, but then providing sensible defaults
> with no configuration seems better.
>
>>> Generators are similar to your composite targets, except that they
>>> come with a selection mechanism. Where you declare a 'library' function
>>> and must use it, Boost.Build declares a generator with target type of
>>> SHARED_LIB and source type of OBJ and requirements of <toolset>gcc,
>>> and then maybe another with requirements of <toolset>msvc.
>>> Then, when generating a 'lib' metatarget, an appropriate generator
>>> is selected and run, and can in turn use generators to convert its
>>> sources to OBJ type.
>>>
>>> In other words, you have this:
>>>
>>> from b3.targets.library import *
>>> hello = library('hello', ['hello.cc'])
>>>
>>> and
>>>
>>> def map_src_to_obj(s):
>>>     return rule(splitext(s)[0] + '.o', s, cxx.compile,
>>>                 params=dict(link='static'))
>>>
>>> Which hardcodes the specific 'library' and 'cxx.compile'. In
>>> Boost.Build, in both places we use a mechanism that selects a
>>> generator appropriate for the target type and properties, which
>>> sounds better than hardcoding - and has enough tricky details to
>>> make reuse worthwhile.
>>
>> Right, and I fully agree with your approach there. Perhaps we can
>> outline the mechanism of generators so I can hook a placeholder into my
>> model ?
>> Are generators documented or discussed on a conceptual / abstract level
>> somewhere ?
>
> The file I've linked has a fairly lengthy comment at the top that might
> be helpful.

Indeed, and I have started studying that. Do you have unit-tests or
something similar where I could see generator objects in action ?
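
For concreteness, the kind of placeholder I could imagine hooking into
my model is roughly the following - purely a sketch under my own
assumptions, not how Boost.Build's generator selection actually works:

# illustrative placeholder: register generators keyed by target type,
# source type and required properties, then pick the first match
_generators = []

def register_generator(target_type, source_type, requirements, fn):
    _generators.append((target_type, source_type, requirements, fn))

def select_generator(target_type, source_type, props):
    for ttype, stype, reqs, fn in _generators:
        if (ttype == target_type and stype == source_type and
                all(props.get(k) == v for k, v in reqs.items())):
            return fn
    raise LookupError('no generator for %s <- %s'
                      % (target_type, source_type))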

>> Again, the code above was not meant to indicate how real composite
>> targets are to be written. It's meant to demonstrate that generating
>> targets on-the-fly is indeed possible with my model. I'd love to combine
>> this with a generic algorithm that can be used to generate these
>> internal target graphs.
>
> Could you clarify what you mean by "on-the-fly"?

I'm referring to the fact that at the point where the (meta-)target is
created it isn't yet known how to produce real targets from it (how to
"expand" the composite target, in my own model), so this expansion is
done late, once the requirement properties for the (meta-)target are
bound to specific values.
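
In code terms, the idea is roughly the following (names are
illustrative; 'make_object' stands in for whatever rule generation ends
up looking like):

class Library:
    """Metatarget: records name and sources, but produces nothing yet."""

    def __init__(self, name, sources):
        self.name = name
        self.sources = sources

    def expand(self, **props):
        # only here are properties such as 'link' or 'toolset' bound,
        # so only here can the real object / library rules be generated
        return [make_object(src, props) for src in self.sources]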

Thanks,
        Stefan

-- 
      ...ich hab' noch einen Koffer in Berlin...
