# Boost :

From: walter_at_[hidden]
Date: 2001-11-23 13:13:11

--- In boost_at_y..., Peter Schmitteckert (boost) <boost_at_s...> wrote:
> Hello,
>
> On Wednesday 21 November 2001 11:58, walter_at_g... wrote:
>
> > You've lost me. Could you please explain or give a reference?

Thanks.

> ------------------------------------------------
> Outline of my most cpu intensive part
> ------------------------------------------------
>
> In my application I have to iteratively diagonalize
> large sparse matrices, which involves the matrix-vector
> product of a matrix C with a vector x.
> To improve performance one can use a representation
> of C where the vector space of C is a tensor product of two
> vector spaces V and W.
> C is now a sum of operators A_i \otimes B_i, where A_i (B_i) acts on
> V (W) only. The total dimension of C is equal to the product of the
> dimensions of V and W. A basis of the vector space of the product
> space of V and W can be represented by a dyadic product of basis
> states of V and W, i.e. as v * w^T.

I guess this is a rather domain-specific vector space decomposition
(if dim(C) is prime, for example, the decomposition seems to be
useless).

> This representation has the advantage that the (sparse) matrix-vector
> multiplication can be represented by BLAS-3 operations instead of
> BLAS-2 operations using dense matrices.

Ok.
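As an aside, the rewrite you describe rests on the standard Kronecker-product identity (A \otimes B) vec(X) = vec(A X B^T) (row-major vec). A minimal NumPy sketch, purely illustrative and not your actual code (all names here are made up):

```python
import numpy as np

# For a single term A (x) B of the sum, the identity
#   (A (x) B) vec(X) = vec(A X B^T)      (row-major vec)
# replaces one large matrix-vector product (BLAS-2) with two dense
# matrix-matrix products (BLAS-3).

rng = np.random.default_rng(0)
n, m = 4, 3                       # dims of V and W
A = rng.standard_normal((n, n))   # acts on V
B = rng.standard_normal((m, m))   # acts on W
x = rng.standard_normal(n * m)    # vector in the product space

# BLAS-2 form: build the full (n*m) x (n*m) matrix and multiply.
y_blas2 = np.kron(A, B) @ x

# BLAS-3 form: view x as an n x m matrix and use two GEMMs.
X = x.reshape(n, m)
y_blas3 = (A @ X @ B.T).ravel()

assert np.allclose(y_blas2, y_blas3)
```

For a sum of terms A_i \otimes B_i one would accumulate A_i @ X @ B_i.T over i, which is where the BLAS-3 gain for large dimensions comes from.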

> In case you're not lost again, the spaces V and W themselves can
> be represented by direct sums of sub-spaces V_l (W_k).
> The matrices A_i can now be represented by blocks A_lm, where
> A_lm is a mapping from V_l to V_m.

I'll believe that ;-)
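If I understand the block structure right, it amounts to storing only the nonzero blocks A_ml and summing y_m = \sum_l A_ml x_l. A hypothetical sketch (the storage layout and names are my assumption, not from your thesis):

```python
import numpy as np

# Hypothetical block-sparse layout: A is a dict of dense blocks, keyed
# (m, l), each mapping sub-space V_l into V_m. Absent keys are zero
# blocks -- that is where the sparsity lives.

dims = [2, 3]                       # dims of the sub-spaces V_0, V_1
rng = np.random.default_rng(1)
blocks = {                          # only nonzero blocks are stored
    (0, 1): rng.standard_normal((dims[0], dims[1])),  # V_1 -> V_0
    (1, 0): rng.standard_normal((dims[1], dims[0])),  # V_0 -> V_1
}

def block_matvec(blocks, xs):
    """y_m = sum over l of A_ml @ x_l, skipping zero blocks."""
    ys = [np.zeros(d) for d in dims]
    for (m, l), A_ml in blocks.items():
        ys[m] += A_ml @ xs[l]
    return ys

xs = [rng.standard_normal(d) for d in dims]
ys = block_matvec(blocks, xs)
```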

> If you're still interested in details you may look at my thesis,
> chapter 5.5.

I guess your list of matrices can conceptually be viewed as a tensor
or as a (domain-specific?) special sparse matrix format. But maybe
that's a matter of taste.

Regards

Joerg