Boost Users :

From: Dan Muller (yg-boost-users_at_[hidden])
Date: 2002-09-12 08:25:35


"Thomas Willhalm" <yg-boost-users_at_[hidden]> wrote in message
news:alpsjb$j20$1_at_main.gmane.org...
> Hello,
>
> I'm new to ublas, so it's easily possible that I overlooked something
> obvious. The first thing I tried to calculate was the following line:
>
> numerics::matrix<double> M(dimension,n);
> numerics::matrix<double> Covariance
>     = numerics::prod(M,numerics::trans(M))/n;
>
> where dimension is 50 and n is 8000.
>
> Unfortunately, it turned out that the following "stupid" code actually
> performs better:
>
> numerics::matrix<double> Covariance(dimension,dimension);
> for (int i=0; i<dimension; ++i)
>     for (int j=0; j<dimension; ++j) {
>         Covariance(i,j) = 0;
>         for (int k=0; k<n; ++k)
>             Covariance(i,j) += M(i,k)*M(j,k);
>         Covariance(i,j) /= n;
>     }
>
> What am I doing wrong?
>
> I'm using gcc 2.95.3 under Linux 2.4.18 on a mobile Pentium III, if it
> matters.
>

How much of a performance difference are you seeing? (I assume that you were
using an optimized build.) I expect that you pay *some* price for using
nicer abstractions.
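
For what it's worth, one thing I'd try (an untested sketch, reusing the
numerics:: names from your code) is copying the transpose into a real matrix
first, so prod() isn't going through the lazy trans() proxy for every element
it reads:

    // Untested: materialize the transpose once, then multiply two
    // concrete matrices instead of a matrix and a trans() expression.
    numerics::matrix<double> Mt = numerics::trans(M);   // n x dimension
    numerics::matrix<double> Covariance
        = numerics::prod(M, Mt) / n;

If that closes most of the gap, the cost is probably in the expression-template
proxies rather than in prod() itself.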

I've been playing around with uBLAS for a couple of weeks, but haven't
really considered its performance yet. I've been re-learning my linear
algebra at the same time, so it's been slow slogging. :-)

