Subject: Re: [boost] Supporting DNNs with Tensors/Multidimensional Arrays
From: Cem Bassoy (cem.bassoy_at_[hidden])
Date: 2018-08-31 08:41:32
On Thu, 30 Aug 2018 at 20:17, Bjorn Reese via Boost wrote <
> On 08/29/18 15:47, Cem Bassoy via Boost wrote:
> > 2. what are the obligatory, necessary basic operations for creating DNN
> > building blocks?
> You may want to investigate Automatic Differentiation, which is a
> building block that extends to many use cases besides DNN.
I am not really sure yet. I have skimmed the paper
https://arxiv.org/pdf/1804.00746.pdf, but I think I need something more
concrete before reasoning about DNNs. I thought of implementing some
Eigen::Tensor operations such as Broadcast(), Convolution(), etc.
However, I need some input from the Boost community on what direction we
want to go and whether we want to enhance the support for multidimensional
arrays, tensors, etc. We are currently behind Eigen::Tensor and some other