
From: Cem Bassoy (cem.bassoy_at_[hidden])
Date: 2021-01-06 09:32:15


Please consider using and contributing to *Boost.uBlas*
<https://github.com/boostorg/ublas>, which recently added *tensor* data
types and operations with convenient Einstein notation:

tensor_t C = C + A(_i,_j,_k)*B(_j,_l,_i,_m) + 5;

More information is available at
https://github.com/boostorg/ublas/wiki/Tensor.
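To make the Einstein notation concrete, here is what such an expression computes, written as plain loops: indices appearing in both operands (_i and _j above) are summed over (contracted), while the remaining indices (_k, _l, _m) index the result. This is only an illustrative sketch; the dimensions, the flat row-major layout, and the function name are assumptions, not the Boost.uBlas implementation.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Tensor extents: A is I x J x K, B is J x L x I x M, C is K x L x M.
struct Dims { std::size_t I, J, K, L, M; };

// Plain-loop equivalent of C = C + A(_i,_j,_k)*B(_j,_l,_i,_m) + 5.
// The repeated indices i and j are contracted; k, l, m stay free.
std::vector<double> contract_and_add(const std::vector<double>& A, // size I*J*K
                                     const std::vector<double>& B, // size J*L*I*M
                                     std::vector<double> C,        // size K*L*M
                                     Dims d) {
    // Row-major flat indexing helpers (an assumption for this sketch).
    auto a = [&](std::size_t i, std::size_t j, std::size_t k) {
        return A[(i * d.J + j) * d.K + k];
    };
    auto b = [&](std::size_t j, std::size_t l, std::size_t i, std::size_t m) {
        return B[((j * d.L + l) * d.I + i) * d.M + m];
    };
    for (std::size_t k = 0; k < d.K; ++k)
        for (std::size_t l = 0; l < d.L; ++l)
            for (std::size_t m = 0; m < d.M; ++m) {
                double sum = 0.0;
                for (std::size_t i = 0; i < d.I; ++i)
                    for (std::size_t j = 0; j < d.J; ++j)
                        sum += a(i, j, k) * b(j, l, i, m);
                C[(k * d.L + l) * d.M + m] += sum + 5.0;
            }
    return C;
}
```

With all extents equal to 1, A = {2}, B = {3}, C = {1}, the result is 1 + 2*3 + 5 = 12, which matches the scalar reading of the expression.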
We could add convolution and pooling functions to Boost.uBlas and provide
examples of how to use them.
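As a rough illustration of the operations being proposed, here is a minimal 1-D "valid" convolution (strictly speaking a cross-correlation, as is common in neural-network libraries) and a non-overlapping max-pooling pass. These are hypothetical sketches of what such functions might compute, not existing Boost.uBlas APIs.

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// 1-D "valid" convolution: the kernel is slid over the input without
// padding, so the output has input.size() - kernel.size() + 1 elements.
std::vector<double> conv1d_valid(const std::vector<double>& input,
                                 const std::vector<double>& kernel) {
    const std::size_t n = input.size(), k = kernel.size();
    std::vector<double> out(n - k + 1, 0.0);
    for (std::size_t i = 0; i + k <= n; ++i)
        for (std::size_t j = 0; j < k; ++j)
            out[i] += input[i + j] * kernel[j];
    return out;
}

// Max pooling over non-overlapping windows of width w; a trailing
// partial window is dropped.
std::vector<double> max_pool1d(const std::vector<double>& input,
                               std::size_t w) {
    std::vector<double> out;
    for (std::size_t i = 0; i + w <= input.size(); i += w) {
        double m = input[i];
        for (std::size_t j = 1; j < w; ++j)
            m = std::max(m, input[i + j]);
        out.push_back(m);
    }
    return out;
}
```

For example, convolving {1, 2, 3, 4} with the kernel {1, 1} yields {3, 5, 7}, and max-pooling {1, 3, 2, 5} with window 2 yields {3, 5}.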

Feel free to contact me if you have any questions, or use
https://gitter.im/boostorg/ublas for more detailed discussion.

Best
CB

On Thu, Dec 31, 2020 at 01:45, Sergei Marchenko via Boost <
boost_at_[hidden]> wrote:

> Hi everyone,
>
> I have a template-based library with several common types of layers which
> can be assembled into various neural networks:
> https://github.com/svm-git/NeuralNet, and I would love to get community
> feedback on the overall design, any issues or missing features, and how
> interesting a library like that would be in general.
>
> I've been watching the trends and research reports in the AI/ML space, and
> I feel that the recent announcements of successful models for image
> classification, computer vision, and natural language processing push the
> focus towards very complex and computationally intensive networks. However,
> I think that the idea behind multi-layer networks is very powerful and is
> applicable in many domains where even a small, lightweight model can be
> used successfully. I also think that if developers have access to a
> library of building blocks that allows them to train and run neural
> networks anywhere C++ code can run, it may encourage a lot of good
> applications.
>
> In the current state, the library is fairly small and should be easy to
> review. It was built with two main goals in mind:
>
> * Provide a collection of building blocks that share a common
> interface, which allows plug-and-play construction of more complex NNs.
> * Compile-time verification of the internal consistency of the
> network, i.e. if a layer's output size does not match the next layer's
> input, the mismatch is caught at compile time.
>
> Once it seems like there is some consensus on the core design and
> usefulness of such a library, I am willing to do the work necessary to
> make the library consistent with Boost requirements for naming
> conventions, folder structure, unit tests, etc. The library relies on
> C++11 language features and depends on just a few STL components, so I
> think it should be straightforward to merge into Boost.
>
> Best regards,
> Sergei Marchenko.
>
> _______________________________________________
> Unsubscribe & other changes:
> http://lists.boost.org/mailman/listinfo.cgi/boost
>


Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk