
From: Cem Bassoy (cem.bassoy_at_[hidden])
Date: 2021-01-07 21:31:36

On Thu, 7 Jan 2021 at 04:06, Sergei Marchenko wrote:

> > Please consider using and contributing to Boost.uBlas,
> > which recently added tensor data
> > types and operations with the convenient Einstein notation:
> > tensor_t C = C + A(_i,_j,_k)*B(_j,_l,_i,_m) + 5;
> Thank you Cem for the suggestion! uBlas::opencl definitely looks
> interesting, since many basic NN layers can be implemented using various
> element-wise functions, and the hardware support that comes with it is very
> appealing. The Einstein tensor notation is convenient for multi-dimensional
> convolution and pooling layers, although I feel that the C++17 requirement
> for the tensor extension is probably too strong. I will need to experiment with the
> library a bit more to get a better sense of what it means to implement NN
> abstractions on top of it.

Sure. Just let me know if you need help.
The contraction is not optimized. If you need optimized versions, please
let me know - we are working on it right now.
We are preparing faster implementations for Tensor-Times-Vector and
Tensor-Times-Matrix.

> Best regards,
> Sergei Marchenko.

