From: David Abrahams (dave_at_[hidden])
Date: 2004-07-02 15:23:47
Michael Stevens <Michael.Stevens_at_[hidden]> writes:
> Hi Dave, hi all,
>
> I read with interest your plans for reviving MTL. It is certainly a shame that
> development of MTL has been stagnant for so long. It certainly had many
> interesting design features and I used it for all my work for many years.
> That said, I switched to using and developing uBLAS long ago and it has an
> excellent design.
Jeremy and I have just completed a re-evaluation of uBlas based on
what's in uBlas' own CVS repository, having not discovered that until
recently (you should have that info on uBlas' Boost page!). We have
some major complaints with uBlas' design. The list is long, and the
issues run deep enough that we don't believe that uBlas is a suitable
foundation for the work we want to do.
Here is a partial list of things we take issue with:
Interface Design
----------------
* Not grounded in Generic Programming. The concept taxonomies, to the
extent they exist, are weak and poorly/incorrectly documented.
Aspects of the design that should be generic are not (e.g. only
certain storage containers are supported, rather than supplying
storage concept requirements). There are no linear algebra concepts
(vector space, field, etc.). The library is so out-of-conformance
with our expectations for Generic Programming that this one item by
itself is probably enough to make it unsuitable for us.
* Redundant specification of element type in matrix/vector storage
(see the first sketch following this list).
* size1 and size2 should be named num_rows and num_columns or
something mnemonic.
* iterator1 and iterator2 should be named column_iterator and
row_iterator or something mnemonic.
* prod should be named operator*; this is a linear algebra library
after all.
* begin() and end() should never violate O(1) complexity expectations.
* insert(i,x) and erase(i) names used inconsistently with standard library.
* Matrix/Vector concept/class interfaces are way too "fat" and need to
be minimized (e.g. rbegin/rend *member* functions should be
eliminated).
* The slice interface is wrong; stride should come last and be
optional; the second argument should be an end index, not a size;
then a separate range interface could be eliminated (see the second
sketch following this list).
* No support for unordered sparse formats -- it can't be made to fit
into the uBlas framework.
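
Two compilable sketches to illustrate a couple of the points above.
First, the redundant element type: double has to be spelled both as
the matrix's value type and again inside the storage argument:

    #include <boost/numeric/ublas/matrix.hpp>

    int main ()
    {
        using namespace boost::numeric::ublas;
        // 'double' is repeated in the storage argument; under a storage
        // concept requirement it could be deduced or defaulted instead.
        matrix<double, row_major, unbounded_array<double> > m (3, 3);
        return m.size1 () == 3 ? 0 : 1;
    }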
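
Second, the kind of slice interface we have in mind. The slice class
below is a made-up sketch, not uBlas' (whose constructor takes start,
stride, and size):

    #include <cstddef>

    // Hypothetical sketch: slice (first, last, stride) with the stride
    // optional, so the two-argument form *is* a range and no separate
    // range class is needed.
    struct slice
    {
        std::size_t first, last, stride;
        slice (std::size_t f, std::size_t l, std::size_t s = 1)
            : first (f), last (l), stride (s) {}
        std::size_t size () const { return (last - first + stride - 1) / stride; }
    };

    int main ()
    {
        slice r (2, 10);     // a plain range: indices 2,3,...,9
        slice s (2, 10, 2);  // indices 2,4,6,8
        return r.size () == 8 && s.size () == 4 ? 0 : 1;
    }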
Implementation
--------------
* Expressions that require temporaries are not supported by uBLAS
under release mode. They are supported under debug mode. For
example, the following program compiles under debug mode, but not
under release mode.
    #include <boost/numeric/ublas/matrix.hpp>
    #include <boost/numeric/ublas/vector.hpp>
    #include <boost/numeric/ublas/io.hpp>
    #include <algorithm>  // std::min
    #include <iostream>

    int main () {
        using namespace boost::numeric::ublas;
        matrix<double> m (3, 3);
        vector<double> v (3);
        for (unsigned i = 0; i < std::min (m.size1 (), v.size ()); ++ i) {
            for (unsigned j = 0; j < m.size2 (); ++ j)
                m (i, j) = 3 * i + j;
            v (i) = i;
        }
        // the nested prod needs a temporary for the inner result
        std::cout << prod (prod (v, m), m) << std::endl;
    }
The workaround to make it compile under release mode is to
explicitly insert the creation of a temporary:
    std::cout << prod (vector<double> (prod (v, m)), m) << std::endl;
There should be no such surprises when moving from debug to
release. Debug mode should use expression templates, too, as the
differences can cause other surprises.
* Should use iterator_adaptor. There is a ton of boilerplate iterator
code in uBLAS that needs to be deleted (see the first sketch
following this list).
* Should use enable_if instead of CRTP to implement operators (see the
second sketch following this list). uBLAS avoids the ambiguity
problem by only using operator* for vector-scalar and matrix-scalar
ops, but that's only a partial solution. Its expressions can't
interact with objects from other libraries (e.g. multi-array) because
they require the intrusive CRTP base class.
* Certain operations, especially on sparse matrices and vectors, and
when dealing with matrix_row and matrix_column proxies, have the
wrong complexity. Functions such as begin() and end() are supposed
to be constant time. I see calls to find1 and find2, which look like
expensive functions (they each contain a loop).
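
Two more sketches for the implementation points above. First, roughly
what we mean by using iterator_adaptor; strided_iterator below is a
made-up example, not uBLAS code, but the adaptor generates all the
iterator boilerplate once the core operations are written:

    #include <boost/iterator/iterator_adaptor.hpp>
    #include <cstddef>

    // Hypothetical sketch: a strided iterator over a random-access base
    // iterator.  iterator_adaptor supplies ++, ==, *, etc.; we only write
    // the core behavior.
    template <class BaseIter>
    class strided_iterator
      : public boost::iterator_adaptor<
            strided_iterator<BaseIter>       // Derived
          , BaseIter                         // Base
          , boost::use_default               // Value
          , boost::forward_traversal_tag     // Traversal
        >
    {
    public:
        strided_iterator (BaseIter it, std::size_t stride)
          : strided_iterator::iterator_adaptor_ (it), stride_ (stride) {}
    private:
        friend class boost::iterator_core_access;
        void increment () { this->base_reference () += stride_; }
        std::size_t stride_;
    };

    int main ()
    {
        int a[6] = { 0, 1, 2, 3, 4, 5 };
        strided_iterator<int*> it (a, 2), end (a + 6, 2);
        int sum = 0;
        for (; it != end; ++it) sum += *it;   // visits 0, 2, 4
        return sum == 6 ? 0 : 1;
    }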
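
Second, the enable_if idea: a free operator* constrained by a trait
instead of an intrusive CRTP base class. The is_matrix trait and
my_matrix type are made up for the sketch:

    #include <boost/utility/enable_if.hpp>

    // Hypothetical trait: any library (or its users) can specialize this
    // for its own matrix types; no common base class required.
    template <class T> struct is_matrix { static const bool value = false; };

    struct my_matrix { double data[2][2]; };
    template <> struct is_matrix<my_matrix> { static const bool value = true; };

    // Free matrix-scalar operator*, enabled only when the left operand is a
    // "matrix".  With CRTP it would only match types derived from the
    // library's expression base class.
    template <class M>
    typename boost::enable_if<is_matrix<M>, M>::type
    operator* (M const& m, double s)
    {
        M r (m);
        for (int i = 0; i < 2; ++i)        // hard-coded 2x2 to keep the
            for (int j = 0; j < 2; ++j)    // sketch short
                r.data[i][j] *= s;
        return r;
    }

    int main ()
    {
        my_matrix m = { { { 1, 2 }, { 3, 4 } } };
        my_matrix r = m * 2.0;
        return r.data[1][1] == 8.0 ? 0 : 1;
    }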
Testing
-------
* There should be a readme describing the organization of the tests.
* Tests should not print stuff that must be manually inspected for
correctness.
* Test programs should instead either complete successfully
(with exit code 0) or not (and print why they failed).
* Abstraction penalty tests need to compare the library with tuned
Fortran BLAS and ATLAS, not a naive 'C' implementation.
Documentation
-------------
* In really bad shape. Redundant boilerplate tables make the head
spin rather than providing useful information.
* Needs to be user-centered, not implementation-centered.
* Needs a good set of tutorials.
Compilers
---------
* Simple example doesn't compile with gcc 3.3. Got it to compile by
#if'ing out operator value_type() in vector_expression.hpp.
> On Saturday 05 June 2004 13:53, David Abrahams wrote:
>> > What do you plan for MTL? How is it different than ublas?
>>
>> MTL is aimed at linear algebra, whereas IIUC ublas is not.
>
> Well the L and A in uBLAS certainly stand for Linear Algebra! Of course the B
> stands for Basic and uBLAS's primary aim is to provide the standard set of
> BLAS functions in a modern C++ environment. Of course as it stands the
> complete uBLAS library is more than just the BLAS functions and includes some
> common Linear Algebra algorithms and many useful types.
Whoops. Of course that's correct.
> That said, I think it is important to separate BLAS functions from
> domain-specific linear algebra algorithm development. This is
> something that has proved itself since the seventies.
>
>> There's a lot more to what's in the current plan than I can lay out
>> here, but the focus will be on support for different kinds of matrices
>> with combinations of these aspects
>>
>> o Shape
>> o Orientation
>> o Symmetry
>> o Sparsity
>> o Blocking
>> o Upper / Lower storage
>> o Unit diagonal
>
> Other than the many forms of blocking (other than banded) uBLAS supports all
> these in its design.
I believe that a design with really good support for blocking can't be
easily grafted onto an existing design that doesn't have it.
> This really is its strength! To a large extent these properties can
> even be combined where it makes mathematical sense. For
> example you can wrap up one of a number of sparse matrix types in a
> symmetric adaptor.
This stuff was all present in MTL2 IIRC.
>> and operations like:
>>
>> o scalar-vector (vector-scalar) multiplication
>> o vector addition (and subtraction)
>> o apply linear operator (left)
>> o norm
>> o inner product
>> o triangular solve
> Other than 'apply linear operator' these are all in uBLAS!
>
>> with expression templates, and of course, zero abstraction penalty ;-)
> Of course uBLAS does this all with ET, but the abstraction penalty may not be
> zero :-)
>
> Other than the lack of ET in the current MTL the big difference
> between the two libraries is the definition of iterators. Neither
> design seems to be perfect with regard to efficiency.
No, and I have some ideas for addressing that.
> Since uBLAS is already in Boost and has a well established and clean user
> syntax it would seem strange to ignore it.
Yeah, I stopped ignoring it long enough to determine for sure that we
should probably ignore it :(.
> From the perspective of building further Linear Algebra algorithms it
> would not be too hard to use the syntax sufficiently portably so
> that a future MTL with expression templates could be used
> interchangeably.
We have some problems with the syntax too, as you can see from the
above. That said, if the design of MTL makes sparing use of members
and instead relies on free functions, you should be able to make
uBlas syntax adapters ;-)
--
Dave Abrahams
Boost Consulting
http://www.boost-consulting.com