
From: Ronald Garcia (rgarcia4_at_[hidden])
Date: 2001-03-13 11:05:22


>> However, I also would prefer to be able to configure the
>> library such that internally BLAS is used. This way we combine
>> the convenience of C++ without wasting all the effort already
>> done on optimising BLAS.

    jl> I agree. Besides, I understand that BLAS is the key point of
    jl> the LAPACK efficiency. The intent is to isolate the
    jl> architecture-dependent optimizations in BLAS (which are very
    jl> likely to be non portable).

I don't agree that wrapping BLAS is the correct route to go. For
starters, let's consider numeric types. BLAS currently supports only a
handful of the basic types available in C++. But suppose I wish to use
Boost.Rational in a matrix: currently I don't see a good way of
achieving this. BLAS duplicates every operation for each supported
numeric type; this is why there exists dgemm (double general matrix
multiply), sgemm (float general matrix multiply), and so on. This
screams for templates. As for optimization, prior work has shown that
numeric libraries can be tuned portably. A starting point for this is
the ATLAS project (Automatically Tuned Linear Algebra Software).
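To make the templates point concrete, here is a minimal sketch of what
a single generic gemm could look like. The function name and interface
are hypothetical (not actual BLAS, MTL, or Boost API); the point is
that one template instantiates for double, float, Boost.Rational, or
any type supporting +, *, and value-initialization, where BLAS needs a
hand-written routine per type.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical generic dense matrix multiply, row-major storage:
// C (m x n) = A (m x k) * B (k x n).
// One template replaces the whole sgemm/dgemm/cgemm/zgemm family.
template <typename T>
void generic_gemm(std::size_t m, std::size_t n, std::size_t k,
                  const std::vector<T>& A, const std::vector<T>& B,
                  std::vector<T>& C)
{
    for (std::size_t i = 0; i < m; ++i)
        for (std::size_t j = 0; j < n; ++j) {
            T sum = T();  // value-initialized "zero" of the element type
            for (std::size_t p = 0; p < k; ++p)
                sum = sum + A[i * k + p] * B[p * n + j];
            C[i * n + j] = sum;
        }
}
```

Instantiating `generic_gemm<double>` recovers the dgemm case, while
`generic_gemm<boost::rational<int> >` would work unchanged, something
no fixed-type Fortran BLAS can offer.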

    jl> The work already done on LAPACK is also huge. I mean HUGE:
    jl> we're talking of 10 years of work; and the library is actually
    jl> quite well designed once you get familiar with their cryptic
    jl> naming scheme.
...and you resign yourself to the architectural limitations of the
language in which it is written. Having spent several months fighting
with some BLAS/LAPACK code and porting some to C++/MTL, I can honestly
say I don't ever want to write a line of Fortran again if I can help
it. And I think we can reasonably avoid that route.

ron


Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk