My 2 cents for what it is worth.

My project started using uBLAS not for its linear algebra capabilities per se, but rather to create fast matrix and vector operations for modeling and signal processing. In other words, a Matlab-like framework for C++.

Long-time users of Matlab are used to the fact that it runs much, much faster when you use matrix operations (even element-by-element ones like ./ and .*) than when you do the same calculations in a loop. I saw uBLAS's use of template metaprogramming as holding the same optimization potential for C++.
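To make that concrete, here is a hypothetical mini-sketch of the expression-template technique in plain C++ (not the actual uBLAS implementation, and the names `Vec`/`VecAdd` are my own): `x + y` builds a lightweight proxy instead of allocating a new vector, and assignment then evaluates the whole expression in one fused loop.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical expression-template sketch (not the real uBLAS code).
// VecAdd is a lazy proxy for the sum of two vector-like operands.
template <class L, class R>
struct VecAdd {
    const L& lhs;
    const R& rhs;
    double operator[](std::size_t i) const { return lhs[i] + rhs[i]; }
    std::size_t size() const { return lhs.size(); }
};

struct Vec {
    std::vector<double> data;
    Vec(std::size_t n, double v) : data(n, v) {}
    double operator[](std::size_t i) const { return data[i]; }
    double& operator[](std::size_t i) { return data[i]; }
    std::size_t size() const { return data.size(); }

    // Assigning an expression runs one pass over the data with no
    // intermediate allocations -- the source of the Matlab-style
    // "vectorized is faster than a hand-written loop" effect.
    template <class E>
    Vec& operator=(const E& e) {
        for (std::size_t i = 0; i < size(); ++i) data[i] = e[i];
        return *this;
    }
};

inline VecAdd<Vec, Vec> operator+(const Vec& l, const Vec& r) {
    return {l, r};
}

// Allow chained sums like x + y + z, still evaluated in one loop.
template <class L, class R>
VecAdd<VecAdd<L, R>, Vec> operator+(const VecAdd<L, R>& l, const Vec& r) {
    return {l, r};
}
```

A chained expression such as `w = x + y + z` builds nested proxies at compile time and still evaluates in a single loop; the real uBLAS machinery generalizes this idea to matrices and many more operations.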

Given my druthers, I'd like to see uBLAS re-focus on providing a fast computational framework and not try to become everything for everyone (e.g., let someone else do the distributed part). Having said that, I think a few things might help to that end.

- Unifying matrix and vector so that people adding functionality don't have to do it twice.
- A library of math.h algorithms implemented as uBLAS functions. I wrote one for double, float, complex<double>, and complex<float> that I'd be happy to contribute.
- An ability to do fast indexing, like Matlab's find() function and x(a), where x is a matrix/vector and a is a matrix/vector. My goal would be to create interpolation routines that take a matrix/vector of doubles/floats as an argument. Step #1 is to find the neighborhood of the arguments on the axes; for this, you'd want find() to return a matrix/vector of unsigned integers.
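As a rough sketch of what that indexing might look like, here is a hypothetical `find`/`gather` pair using `std::vector` as a stand-in for a uBLAS vector (the names and signatures are my own, not an existing uBLAS API): `find` returns the unsigned indices where a predicate holds, and `gather` builds the Matlab-style x(a) selection.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical Matlab-style find(): indices where the predicate holds,
// returned as unsigned integers so they can feed further indexing.
template <class Pred>
std::vector<std::size_t> find(const std::vector<double>& x, Pred p) {
    std::vector<std::size_t> idx;
    for (std::size_t i = 0; i < x.size(); ++i)
        if (p(x[i])) idx.push_back(i);
    return idx;
}

// Hypothetical x(a): the elements of x at the index vector a.
std::vector<double> gather(const std::vector<double>& x,
                           const std::vector<std::size_t>& a) {
    std::vector<double> out;
    out.reserve(a.size());
    for (std::size_t i : a) out.push_back(x[i]);
    return out;
}
```

For the interpolation use case above, `find` would locate the neighborhood of each query point on an axis (e.g., all knots below the query value), and `gather` would pull out the corresponding samples.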

The long-term goal would be to provide not only linear algebra, but also coordinate rotations, statistics, filtering, interpolation, Fourier transforms, derivatives, etc. In my vision, many of these other features could be provided by people outside the core uBLAS team. I'd be very interested to see what the ViennaCL people are up to in their effort to support GPUs using uBLAS syntax.

Sean Reilly

On Mon, Dec 9, 2013 at 7:22 AM, Rutger ter Borg <rutger@terborg.net> wrote:

> On 2013-12-09 11:18, sguazt wrote:
>> +1 for a library with loosely-coupled features, even if this means
>> a complete rewrite of uBLAS
>>
>> For the integration of computational kernels, why don't we use the
>> boost-numeric_bindings lib?
>>
>> Also, I'd like to have more MATLAB-like functionalities (I've
>> implemented some of them in boost-ublasx, but they rely on
>> boost-numeric_bindings and LAPACK functions)
>
> If there is enough interest to integrate (parts of) the numeric
> bindings library, I'm willing to help.
>
> Cheers,
> Rutger

_______________________________________________
ublas mailing list
ublas@lists.boost.org
http://lists.boost.org/mailman/listinfo.cgi/ublas