Boost-Build :
From: Reece Dunn (msclrhd_at_[hidden])
Date: 2005-09-15 03:12:43
ANDREW MARLOW, BLOOMBERG/ LONDON wrote:
> ----- Original Message -----
> From: Reece Dunn <jamboost_at_[hidden]>
> NOTE: Since BBv2 is doing #include scanning, it should be possible to
> automatically scan for <pch-file> in the top-level source file and if
> found, enable PCHs. This would be much simpler to use and would be the
> preferred way to go.
>
> AM> One has to be careful here. The presence of such a #include
> statement could easily be there because the code is multi-platform and
> takes advantage of PCHs where possible. For example, software that is
> developed for Windows and Unix may contain such #include statements,
> but typically there is no PCH facility on Unix. The PCH header file
> would contain #ifdefs to make the file a no-op where the
> facility is not supported.
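For concreteness, the guarded header described above might look
something like this (a sketch only; USE_PCH and the header name are
hypothetical, and the no-op branch is exactly what is argued against
below):

    // pch.h - project-wide precompiled header (hypothetical sketch)
    #ifndef PROJECT_PCH_H
    #define PROJECT_PCH_H

    #if defined(_MSC_VER) && defined(USE_PCH)
    // Heavy, rarely-changing headers worth precompiling.
    #include <vector>
    #include <string>
    #include <map>
    #endif // no-op where no PCH facility exists

    #endif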
Agreed, but compilers that do not support PCHs would simply not use the
PCH information found by the include scanning. The problem arises if the
PCH detection in the build system goes out of step with the #ifdef
detection in the code.

For example, if the build system is told not to use PCHs when building
on Unix, but you then compile on Unix with a GCC version that does
support PCHs, the build system and the compiler will hold conflicting
views of whether a PCH is in use.
It may be a good idea for BBv2 to provide a <define>BOOST_HAS_PCH so
that such #ifdefs can be written correctly.
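For instance, the guard could then key on the build-system-provided
define instead of on platform tests, so the two views cannot drift
apart (a sketch; BOOST_HAS_PCH is only proposed here, not an existing
define):

    // pch.h - sketch; BBv2 would pass -DBOOST_HAS_PCH exactly when it
    // builds and uses a precompiled header.
    #ifndef PROJECT_PCH_H
    #define PROJECT_PCH_H

    #include <vector>   // included unconditionally (see below)
    #include <string>

    #ifdef BOOST_HAS_PCH
    // Anything that must genuinely differ when a PCH is in effect
    // goes here; keyed on the build system's own decision, this
    // #ifdef cannot go out of step with it.
    #endif

    #endif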
Having said that, you don't need to make a PCH header a no-op for
targets that don't support PCHs, because the compiler will still need
to process the headers included inside it. Compilers that support PCHs
simply load the cached AST/internal representation instead of running
the header through the lexer/parser again (as compilers without PCH
support must). Making the PCH header a no-op for compilers that don't
support PCHs would therefore make the build fail, since those compilers
would never see the includes it carries.
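In other words, the portable form of the PCH header keeps its includes
unconditional (sketch):

    // pch.h - the PCH header stays an ordinary header
    #ifndef PROJECT_PCH_H
    #define PROJECT_PCH_H

    // Always include the real headers. A PCH-capable compiler loads
    // its cached representation of this file instead of re-parsing
    // it; a compiler without PCH support just parses it normally.
    // Wrapping these includes in a no-op #ifdef would hide them from
    // non-PCH compilers and break the build.
    #include <vector>
    #include <string>
    #include <map>

    #endif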
- Reece