From: Sergei Marchenko (serge_v_m_at_[hidden])
Date: 2021-01-07 03:06:22
> Please consider using and contributing to Boost.uBlas<https://github.com/boostorg/ublas>, which recently added tensor data types and operations with a convenient Einstein notation:
> tensor_t C = C + A(_i,_j,_k)*B(_j,_l,_i,_m) + 5;
Thank you, Cem, for the suggestion! uBlas::opencl definitely looks interesting, since many basic NN layers can be implemented using various element-wise functions, and the hardware support that comes with it is very appealing. The Einstein tensor notation is convenient for multi-dimensional convolution and pooling layers, although I feel that the C++17 requirement for the tensor extension is probably too strong. I will need to experiment with the library a bit more to get a better sense of what it means to implement NN abstractions on top of it.
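
For reference, below is a minimal sketch of the quoted contraction as a standalone C++17 program. It assumes the Boost 1.70-era tensor API (the tensor<> template, shape, and the index placeholders in boost::numeric::ublas::index, as used in the library's shipped tensor examples); exact names may differ in newer revisions, and the extents chosen here are arbitrary, they only have to agree on the contracted indices _i and _j.

#include <boost/numeric/ublas/tensor.hpp>
#include <iostream>

int main()
{
    using namespace boost::numeric::ublas;
    using namespace boost::numeric::ublas::index;

    using tensor_t = tensor<float>;

    // A has extents (i=3, j=4, k=5); B has extents (j=4, l=6, i=3, m=7).
    auto A = tensor_t(shape{3, 4, 5}, 1.0f);
    auto B = tensor_t(shape{4, 6, 3, 7}, 2.0f);

    // The repeated indices _i and _j are contracted; the free indices
    // _k, _l and _m remain, so C has extents (5, 6, 7).
    tensor_t C = A(_i, _j, _k) * B(_j, _l, _i, _m) + 5.0f;

    std::cout << "C has " << C.size() << " elements\n";  // 5*6*7 = 210
}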
Best regards,
Sergei Marchenko.