<div dir="ltr">My 2 cents, for what it is worth.<br><br>My project started using uBLAS not for its linear algebra capabilities per se, but rather to create fast matrix and vector operations for modeling and signal processing. In other words, a Matlab-like framework for C++.<br>
<br>Long-time users of Matlab are used to the fact that it runs much, much faster when you use matrix operations (even element-by-element ones like ./ and .*) than when you do those same calculations in a loop. I saw uBLAS&#39;s use of template metaprogramming as holding the same optimization potential for C++.<br>
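<br>To make that concrete, here is a minimal, self-contained sketch of the expression-template idea (not uBLAS&#39;s actual implementation, just the technique): an elementwise product like Matlab&#39;s .* is built as a lazy expression object, so assignment compiles down to a single fused loop with no temporary vector.<br>

```cpp
#include <cstddef>
#include <vector>

// Lazy expression for an elementwise product (Matlab's .*).
// Nothing is computed until the expression is assigned to a Vec.
template <class L, class R>
struct ElemProd {
    const L& lhs;
    const R& rhs;
    double operator[](std::size_t i) const { return lhs[i] * rhs[i]; }
    std::size_t size() const { return lhs.size(); }
};

struct Vec {
    std::vector<double> data;
    explicit Vec(std::size_t n) : data(n) {}
    double&       operator[](std::size_t i)       { return data[i]; }
    double        operator[](std::size_t i) const { return data[i]; }
    std::size_t   size() const { return data.size(); }
    // Assigning from an expression runs one fused loop: no temporaries.
    template <class E>
    Vec& operator=(const E& e) {
        for (std::size_t i = 0; i < size(); ++i) data[i] = e[i];
        return *this;
    }
};

template <class L, class R>
ElemProd<L, R> element_prod(const L& x, const R& y) { return {x, y}; }
```

With this, `z = element_prod(x, y);` behaves like Matlab&#39;s `z = x .* y` but generates the same machine code as a hand-written loop, which is the speedup the paragraph above describes.<br>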
<br>Given my druthers, I&#39;d like to see uBLAS re-focus on providing a fast computational framework and not try to become everything for everyone (e.g., let someone else do the distributed part). Having said that, I think a few things might help to that end.<div>
<ul><li>Unifying matrix and vector so that people adding functionality don&#39;t have to do it twice.</li><li>A library of math.h algorithms implemented as uBLAS functions. I wrote one for double, float, complex&lt;double&gt;, and complex&lt;float&gt; that I&#39;d be happy to contribute.</li>
<li>An ability to do fast indexing like Matlab&#39;s find() function and x(a), where x is a matrix/vector and a is a matrix/vector. My goal would be to create interpolation routines that take a matrix/vector of doubles/floats as an argument. Step #1 is to find the neighborhood of the arguments on the axes. For this, you&#39;d want it to return a matrix/vector of unsigned integers.</li>
</ul><div>The long-term goal would be to support not only linear algebra, but also coordinate rotations, statistics, filtering, interpolation, Fourier transforms, derivatives, etc. In my vision, many of these other features could be provided by people outside of the core uBLAS team. I&#39;d be very interested to see what the ViennaCL people are up to in their effort to support GPU processors using uBLAS syntax.</div>
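<div>As a rough illustration of the two list items above, here is a plain-C++ sketch (these helpers are hypothetical, not existing uBLAS API): applying a math.h function elementwise, a Matlab-style find() that returns unsigned-integer indices, and gather-style indexing x(a) using those indices.</div>

```cpp
#include <cmath>
#include <cstddef>
#include <functional>
#include <vector>

// Elementwise application of a math.h scalar function, e.g. std::sin.
std::vector<double> apply_elementwise(const std::vector<double>& x,
                                      double (*f)(double)) {
    std::vector<double> y(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) y[i] = f(x[i]);
    return y;
}

// Matlab-like find(): the indices (unsigned integers) where pred holds.
std::vector<std::size_t> find_indices(const std::vector<double>& x,
                                      const std::function<bool(double)>& pred) {
    std::vector<std::size_t> idx;
    for (std::size_t i = 0; i < x.size(); ++i)
        if (pred(x[i])) idx.push_back(i);
    return idx;
}

// Matlab-like x(a): gather the elements of x at the indices in a.
std::vector<double> gather(const std::vector<double>& x,
                           const std::vector<std::size_t>& a) {
    std::vector<double> y(a.size());
    for (std::size_t i = 0; i < a.size(); ++i) y[i] = x[a[i]];
    return y;
}
```

<div>An interpolation routine along the lines described would first call something like find_indices() on the axis values to locate each argument&#39;s neighborhood, then gather() the bracketing samples.</div>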
</div><div><br></div><div>Sean Reilly</div></div><div class="gmail_extra"><br><br><div class="gmail_quote">On Mon, Dec 9, 2013 at 7:22 AM, Rutger ter Borg <span dir="ltr">&lt;<a href="mailto:rutger@terborg.net" target="_blank">rutger@terborg.net</a>&gt;</span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="im">On 2013-12-09 11:18, sguazt wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br>
<br>
+1 for a library with loosely-coupled features, even if this means a<br>
complete rewrite of uBLAS<br>
<br>
For the integration of computational kernels, why don&#39;t we use the<br>
boost-numeric_bindings lib?<br>
<br>
Also, I&#39;d like to have more MATLAB-like functionality (I&#39;ve<br>
implemented some of it in boost-ublasx, but it relies on<br>
boost-numeric_bindings and LAPACK functions)<br>
<br>
</blockquote>
<br></div>
If there is enough interest to integrate (parts of) the numeric bindings library, I&#39;m willing to help.<br>
<br>
Cheers,<br>
<br>
Rutger<div class="im"><br>
<br>
<br>
<br>
_______________________________________________<br>
ublas mailing list<br>
<a href="mailto:ublas@lists.boost.org" target="_blank">ublas@lists.boost.org</a><br>
<a href="http://lists.boost.org/mailman/listinfo.cgi/ublas" target="_blank">http://lists.boost.org/mailman/listinfo.cgi/ublas</a><br></div>
</blockquote></div><br></div>