Subject: Re: [boost] Boost Cmake Modules
From: Niall Douglas (s_sourceforge_at_[hidden])
Date: 2017-01-13 03:34:50


>> * Autodiscovers any tests you have and sets them up with ctest
>
> This is interesting. But I don't think it scales. Some tests require
> linking in multiple sources, or require certain flags to be enabled. I
> really don't see how to autodiscover tests that will work for most
> boost libraries. Perhaps a `bcm_auto_test` function could be called by
> the author to do that (I would like to know what kind of conventions
> you follow when discovering the tests), and libraries that have more
> complicated testing infrastructure could add their tests manually with
> `bcm_add_test`.

The way it scales is that it makes use of the directory structure. So if
you fire .cpp files into /test, each is considered a ctest target, but if
you put them into /test/somedir they get new semantics. In particular,
you'll see Boost.AFIO v2 uses a /test structure which says to use
Boost.KernelTest, which is a new meta-test infrastructure.
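
To give a flavour of the general idea, here is a minimal sketch of
directory-based discovery. It is not the actual boost-lite
implementation, and the variable names are made up:

    # Sketch: every .cpp directly under /test becomes its own ctest target
    enable_testing()
    file(GLOB test_sources "${CMAKE_CURRENT_SOURCE_DIR}/test/*.cpp")
    foreach(test_src ${test_sources})
      get_filename_component(test_name "${test_src}" NAME_WE)
      add_executable(${test_name} "${test_src}")
      add_test(NAME ${test_name} COMMAND ${test_name})
    endforeach()
    # subdirectories such as /test/somedir would be dispatched to other
    # test infrastructure (e.g. Boost.KernelTest) instead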

>> * Provides out of the box CI ctest scripting which uploads results to
>> a
>> CDash for your project and updates your github website with the
>> output
>> from doxygen
>
> This is probably useful for libraries that use doxygen. Using sphinx or
> mkdocs, I don't need to push changes out to github, as ReadTheDocs will
> update the documentation on push. Plus it will store the documentation
> for different tagged versions as well. This scales much nicer than
> using github pages.

I don't know anybody who has used doxygen for anything serious and been
happy with it. The good folks over at DoxyPress did an amazing job of
refactoring doxygen, but in the end the fundamental design is just broken.

Problem is, and I think most would also agree here, there isn't anything
better than doxygen for C++ reference docs. ReadTheDocs + Breathe
generates what I find to be unusable reference docs. Formatting which
suits Python well suits C++ terribly.

Many years ago Stefan (along with Dave Abrahams) championed a new C++
docs tool which was much better than doxygen, but in the end the effort
required to finish it proved too hard to make happen. I'm sure most
would agree that's a shame.

If anybody knows of a tool which can understand doxygen markup but
generates much better reference docs, I would be *extremely* interested.
The really key part is that new C++ docs tooling *needs* to grok doxygen
markup. So many new tools don't, and therefore get no traction
because so many C++ codebases are locked into doxygen markup.

>> * Automatically matches git SHA in dependent git subrepos in flat
>> dependency configurations
>
> I am not a fan of git submodules, as it breaks downloading the source
> tarball files from github.

That's a long-standing bug on github, and the fault of github, not of
anyone else. I really wish they would let you disable the tarball
download and supply your own tarball URL. The current broken system is
very confusing for users.

>> * Automatically merges any develop commit passing all tests on all
>> platforms according to CDash into master branch
>> * Automatically packages up your library and publishes it to tarball,
>> vcpkg (with ubuntu launchpad and homebrew in progress right now)
>
> Adding support for CPack to create tarballs, debian, and fedora
> packages would be nice. However, mapping dependency names
> between different package managers can be handled through convention
> for boost-only libraries; external dependencies (such as zlib)
> are not so easy.

I bailed out on that question and simply have each boost-lite library
maintain the metadata for each package repo. i.e. it's the long way round.
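
For reference, what mapping those names per package manager means in
practice, if done via CPack as suggested above, is something along these
lines per library. The values are purely illustrative and this is not
how boost-lite actually carries its metadata:

    # Illustrative per-library packaging metadata via CPack
    set(CPACK_PACKAGE_NAME "boost-outcome")
    set(CPACK_PACKAGE_VERSION "1.0.0")
    set(CPACK_GENERATOR "TGZ;DEB")   # tarball + debian package
    set(CPACK_DEBIAN_PACKAGE_MAINTAINER "maintainer_at_[hidden]")
    # dependency names have to be spelled per package manager, by hand
    set(CPACK_DEBIAN_PACKAGE_DEPENDS "zlib1g-dev")
    include(CPack)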

> Also, as the library would support the standard cmake install flow, it can
> easily be installed with cget (and dependencies can be installed with a
> requirements.txt file). I find this flow preferable to trying to
> update system-level package managers like homebrew or vcpkg. Although,
> from what I've seen of vcpkg, it works very similarly to cget except
> that it is windows-centric.

I'd call cget an external tool dependency personally. I certainly had
never heard of it before you mentioned it, and I would have no idea how
to install it on Windows. I am assuming it is this:
https://linux.die.net/man/1/cget

I think this stuff comes back to David Sankel's notion of libraries
being anti-social. If you're anti-social, you force library users up
this hill of preconfig and build just to test out your library. Bjarne
has been railing against that for years, and it is one of his biggest
bugbears with Boost. Yet trying out Boost on every platform except
Windows is a simple install from that platform's package repos, and
therefore a very low hill to climb. I'm therefore fond of package
repositories; end users like them too.

>> * Libraries based on this are 100% standalone, when you clone the git
>> repo or unpack the tarball you are 100% ready to go. Nothing else
>> needed, not even configure and build. No arcane command line programs
>> to
>> run.
>
> I don't understand this. My focus with these modules is to support the
> standard configure, build and install flow in cmake. Trying to hack
> cmake with a different conventional flow seems problematic. If users
> don't like this flow, or are scared of typing, then external tools can be
> created to automate this. However, creating a different flow in cmake
> will just cause a dissonance with other cmake libraries.

Sorry, you misunderstood me. What I meant above is that the cmake is
ready to go. You don't need to run cmake generators, or run some python
master cmake control script, etc. The libraries themselves are header
only currently, but sometime this year I'm going to write a preprocessor
stage for cmake which, at dev time, converts a header-only library that
does preprocessor metaprogramming (like Outcome does) into a single
large preexpanded include file. That should reduce the gap between C++
Module include times and non-C++ Module include times for users, plus it
means I can provide an easy playpen on gcc.godbolt etc.
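
As a rough illustration only, such a dev-time stage could be wired up
along these lines. The flags are gcc/clang-style, the file names and
target are assumptions rather than Outcome's actual build code, and a
real implementation would need to be more selective than a blanket -E
pass:

    # Sketch: pre-expand a header-only library into one include file
    add_custom_command(
      OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/outcome_preexpanded.hpp
      COMMAND ${CMAKE_CXX_COMPILER} -E -P -x c++
              ${CMAKE_CURRENT_SOURCE_DIR}/include/outcome.hpp
              -o ${CMAKE_CURRENT_BINARY_DIR}/outcome_preexpanded.hpp
      DEPENDS ${CMAKE_CURRENT_SOURCE_DIR}/include/outcome.hpp
      COMMENT "Pre-expanding header-only library into a single include"
    )
    add_custom_target(preexpand ALL
      DEPENDS ${CMAKE_CURRENT_BINARY_DIR}/outcome_preexpanded.hpp)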

>> I wouldn't recommend that anyone else use it yet. It is very much a
>> work
>> in progress, but all the above is working, and you can see it in
>> action
>> in proposed Boost.Outcome. It also has nil documentation.
>
> So I tried to install your Boost.Outcome library with no luck. First, I
> did `cget install ned14/boost.outcome`. And that didn't work because of
> the missing git submodules. So, I cloned it locally with its
> submodules, and then did `cget install boost.outcome`. It still didn't
> work.

I'm not sure about this cget tool, but cmake --build . --target install
should work on all platforms after you've done a *recursive* git
submodule checkout. By "should" I mean install is not being CI tested
yet, and it could be broken after some changes I did earlier this week,
so caveat emptor.
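
Concretely, the flow I mean is just the standard one (repo URL inferred
from the cget invocation you quoted):

    git clone --recursive https://github.com/ned14/boost.outcome.git
    cd boost.outcome
    mkdir build && cd build
    cmake ..
    cmake --build . --target install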

A lot of people only check out the first layer of git submodules. It's
very important that the checkout is recursive, as the recursion goes
deep. Failing that, just use the prebuilt tarball at
https://dedi4.nedprod.com/static/files/boost.outcome-v1.0-source-latest.tar.xz.
That tarball is what passed the tests on the CI, so it's complete and
verified working.

> However, it looks like you have done a lot of awesome work here, and it
> would be great to integrate those into the cmake modules so other boost
> libraries could take advantage of it.

A lot of the cool stuff mentioned above is possible because this build
system imposes a very strict orthodoxy on libraries using it. It is a
classic cathedral design: there is exactly one way of doing things and
libraries get no choice. This has the big advantage that all client
libraries get all the cool stuff for free, plus anything added later,
without needing to change them, but it also has the big disadvantage
that any design mistake is a showstopper for everyone and cannot easily
be worked around. In other words, if I didn't think of some build use
case, or if I didn't think that build use case important enough to
support, you are hosed.

That's why I don't recommend anyone else use it until I've fixed more of
the systemic design flaws and then given it a year of maturity where it
isn't being constantly chopped and changed. Only then would I recommend
it to others, and at that point you'll probably see a website go up and
an announcement here of a new collection of C++ standards-aspiring
libraries to complement Boost (be aware this is likely 2020 or later at
current rates of progress).

Niall

-- 
ned Productions Limited Consulting
http://www.nedproductions.biz/ http://ie.linkedin.com/in/nialldouglas/
