My 2 cents, for what it's worth.

My project started using uBLAS not for its linear algebra capabilities per se, but rather to create fast matrix and vector operations for modeling and signal processing.  In other words, a Matlab-like framework for C++.

Long-time users of Matlab know that it runs much, much faster when you use matrix operations (even element-by-element ones like ./ and .*) than when you do the same calculations in a loop.  I saw uBLAS's use of template metaprogramming as holding the same optimization potential for C++.

Given my druthers, I'd like to see uBLAS re-focus on providing a fast computational framework and not try to become everything for everyone (e.g., let someone else do the distributed part).  Having said that, I think a few things might help to that end.
The long-term goal would be to support not only linear algebra, but also coordinate rotations, statistics, filtering, interpolation, Fourier transforms, derivatives, etc.  In my vision, many of these other features could be provided by people outside of the core uBLAS team.  I'd be very interested to see what the ViennaCL people are up to in their effort to support GPU processors using uBLAS syntax.

Sean Reilly

On Mon, Dec 9, 2013 at 7:22 AM, Rutger ter Borg <> wrote:
On 2013-12-09 11:18, sguazt wrote:

+1 for a library with loosely-coupled features, even if this means
completely rewriting uBLAS

For the integration of computational kernels, why don't we use the
boost-numeric_bindings lib?

Also, I'd like to have more MATLAB-like functionality (I've
implemented some of it in boost-ublasx, but it relies on
boost-numeric_bindings and LAPACK functions)

If there is enough interest to integrate (parts of) the numeric bindings library, I'm willing to help.



ublas mailing list