
From: rwgk_at_[hidden]
Date: 2000-12-11 09:15:38

David wrote:
> I still have some doubt about whether you can really use it to
achieve the
> compilation independence you desire, since it didn't sound like you
> understood the constraints built into the C++ language.

If exporting & importing to/from_python function pointers does not
reduce the compilation dependencies, breaking up a huge module into
smaller components becomes more a question of dynamic linking versus
static linking. I believe the advantages of dynamic linking are clear.
Imagine you had to statically link Python and all the third-party
components you are using each time you want to add a little extension
module! The system we are working on will rival Python in complexity,
to say the least. Being able to organize this system in dynamically
linked components (with Python as the core) is essential.

Actually, it was not so much the compilation independence that
motivated my proposal. There is another very important motivation
that I failed to mention until now.

Assume there are several groups of classes. In C++, each group lives
in a certain namespace. However, there are member functions of classes
in a given namespace that have parameters or return values that are
objects defined in another namespace. E.g.:

  namespace A {
    class A1 {};
  }
  namespace B {
    class B1 { A::A1 foo(); };
  }

It seems most natural to expose these classes to python in a way that
mirrors the layout in C++:

import A
obja = A.A1()
import B
objb = B.B1()
res = objb.foo() # res is A.A1!

Assume further that there are a few helper functions and data
definitions (e.g. constants) in each namespace. It is more than
just a convenience if these items are available as A.helper(),
A.some_constant etc.

It would be nice if this concept would also work for nested
namespaces, but this is probably asking too much (and probably not
all that important).

Finally, there is an important psychological component. If a group of
classes is exposed in a huge module as a few of many, the authors will
not be as happy as they would be if their group of classes is exposed
by a module with a recognizable name. This is not just a question of
strong egos, but also of getting credit and funding!

Ruling out the option of statically linking with the Python core, the
Python "import <name>" statement requires a physical file <name>.*.
Therefore it seems to me that the only way to expose wrapped C++
classes as groups is to have a separate module for each group. Does
someone see an alternative?

If the symbols seen by the dynamic loader all live in the same name
space (as is the default under Tru64), the problem of inter-module
communication at the C++ level reduces to ensuring that
  1. at compile time: the to/from_python functions are declared.
  2. at run time: the necessary modules are imported before the
     corresponding to/from_python functions are used.

If the symbols seen by the dynamic loader do not all live in the same
scope (most systems; under Tru64 use the -hidden linker option), the
to/from_python functions can be implemented as wrappers that import
PyCObjects holding pointers to the actual converter functions. This
is both highly portable and, due to the use of dynamic loading (as
opposed to statically linking the individual shared libraries against
each other), very friendly to the end user.


Boost list run by bdawes at, gregod at, cpdaniel at, john at