Subject: [boost] Supporting DNNs with Tensors/Multidimensional Arrays
From: Cem Bassoy (cem.bassoy_at_[hidden])
Date: 2018-08-29 13:47:45


GSoC 2018 <https://summerofcode.withgoogle.com/organizations/4507228564881408/>
ended just one week ago, and we had many successfully completed student
projects <https://github.com/BoostGSoC18>.

I was responsible for adding tensor support to Boost.uBLAS, primarily to
support multilinear algebra operations in the field of numerics. The wiki
description along with the implementation can be found here
<https://github.com/BoostGSoC18/tensor>.

Similar to Boost.multi_array
<https://www.boost.org/doc/libs/1_68_0/libs/multi_array/doc/index.html>, the
runtime-reshapable tensor data structure is parametrized in terms of the
number of dimensions (rank/order), the dimension extents, the data type, the
layout (first- and last-order) and the storage type. The first two are
runtime-variable. I am also about to add a subtensor type (a view/handle of a
tensor) along with multidimensional iterators for convenient algorithm
implementation.
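
To make the parametrization concrete, here is a rough usage sketch. The
header path, the shape type and the member names are my assumptions based on
the wiki description, not necessarily the final interface in the repository:

    #include <boost/numeric/ublas/tensor.hpp>   // assumed header location
    #include <vector>

    int main()
    {
        using namespace boost::numeric::ublas;

        // value type, layout (first_order/last_order) and storage are template
        // parameters; rank and dimension extents are runtime arguments
        using tensor_t = tensor<float, first_order, std::vector<float>>;

        tensor_t A{shape{3, 4, 2}};   // order-3 tensor with extents 3 x 4 x 2
        A.at(0, 0, 0) = 1.0f;         // multi-index element access
        A.reshape(shape{4, 3, 2});    // runtime reshaping keeps all 24 elements
    }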

It is not yet as flexible as GSL's multi_span
<https://github.com/Microsoft/GSL/blob/master/include/gsl/multi_span>: it does
not yet support static rank and dimensions. However, basic generic tensor
operations (contraction/transposition/reshaping/...) are provided, including a
nice placeholder syntax for Einstein's summation convention based on C++17
features. The operations are evaluated using expression templates (not smart
yet).
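
To illustrate what the placeholder syntax for Einstein's summation convention
looks like, here is a small sketch. The placeholder identifiers and the index
namespace are assumptions on my side and may be spelled differently in the
actual code:

    #include <boost/numeric/ublas/tensor.hpp>   // assumed header location

    int main()
    {
        using namespace boost::numeric::ublas;
        using namespace boost::numeric::ublas::index;  // assumed placeholder namespace

        tensor<float> A{shape{3, 4, 2}};
        tensor<float> B{shape{4, 2, 5}};

        // contraction over the repeated indices _j and _k, i.e.
        // C(i,l) = sum over j,k of A(i,j,k) * B(j,k,l);
        // the expression template is evaluated on assignment
        tensor<float> C = A(_i, _j, _k) * B(_j, _k, _l);
    }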

Similar to the tensor
<https://eigen.tuxfamily.org/dox/unsupported/group__CXX11__Tensor__Module.html>
module of Eigen, which is used by TensorFlow
<https://github.com/tensorflow/tensorflow>, I think the tensor data structure
in Boost.uBLAS could be used for implementing deep neural networks or
higher-order statistics. I am not sure whether the C++ community would
appreciate it if Boost provided some form of basic operations for building
*deep neural networks* (DNNs). I would like to ask:

1. Does it make sense for Boost to support basic operations for DNNs?
2. What are the obligatory basic operations needed for creating DNN building
blocks? (A minimal sketch of one such block follows below.)
3. Are there any additional data structure parameters that need to be added
to support DNNs (efficiently)?
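
Regarding question 2, the sketch announced above: a single dense layer already
exercises most of the primitives a DNN forward pass needs, namely a
contraction (matrix-vector or matrix-matrix product), a broadcast addition for
the bias, and an elementwise nonlinearity. A minimal, library-free sketch of
those three steps:

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // y = relu(W * x + b): (1) contraction (matrix-vector product),
    // (2) broadcast addition of the bias, (3) elementwise nonlinearity
    std::vector<float> dense_relu(const std::vector<float>& W,  // rows x cols, row-major
                                  const std::vector<float>& x,  // cols entries
                                  const std::vector<float>& b,  // rows entries
                                  std::size_t rows, std::size_t cols)
    {
        std::vector<float> y(rows, 0.0f);
        for (std::size_t i = 0; i < rows; ++i) {
            for (std::size_t j = 0; j < cols; ++j)
                y[i] += W[i * cols + j] * x[j];   // contraction
            y[i] = std::max(0.0f, y[i] + b[i]);   // bias + ReLU
        }
        return y;
    }

    int main()
    {
        // 2x3 weight matrix applied to a 3-vector
        std::vector<float> W{1, 0, 2,  0, 1, 3}, x{1, 2, 3}, b{0.5f, -1.0f};
        auto y = dense_relu(W, x, b, 2, 3);       // y = {7.5, 10.0}
    }

Convolutions, pooling and batched variants add further contractions and
elementwise operations on top of this, which is where runtime rank and the
planned subtensor views would matter.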

Cem

