Subject: [boost] An interest in a neural net library?
From: Sergei Marchenko (serge_v_m_at_[hidden])
Date: 2019-02-09 23:03:34

Hi everyone.

I would like to ask whether there is any interest in a library that implements simple neural networks trained via backpropagation. The main idea is to define the configuration of a feed-forward multi-layer network via template parameters, and also to provide implementations of a couple of well-known network variations (e.g. convolutional, sampling, network ensembles).

// Define a network ensemble of two networks with the same input and output
// sizes, but a different number of layers and a different number of neurons
// in each layer.
typedef neural_network<100, 50, 10> network_1;
typedef neural_network<100, 70, 20, 10> network_2;

auto network = ensemble(network_1(), network_2());

// Train the network on a training set.
// item.input is the 100-element input vector;
// item.output is the 10-element desired output vector.
for (auto item : training_set)
    network.train(item.input, item.output, learning_rate);

// Feed some input data into the network to get the result.
auto result = network.process(input);
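Since the networks are trained via backpropagation, a minimal sketch of what a single train step might do internally could make the proposal concrete. The `Neuron` struct, its member names, and the AND-gate example below are illustrative assumptions for a single sigmoid unit with squared-error gradient descent, not the proposed library's API:

```cpp
#include <array>
#include <cmath>
#include <cstddef>

// Illustrative single sigmoid neuron with two inputs; a real layer in the
// proposed library would generalize this over template-specified sizes.
struct Neuron {
    std::array<double, 2> w{0.0, 0.0};
    double b = 0.0;

    // Forward pass: weighted sum followed by the sigmoid activation.
    double process(const std::array<double, 2>& x) const {
        double z = w[0] * x[0] + w[1] * x[1] + b;
        return 1.0 / (1.0 + std::exp(-z));
    }

    // One backpropagation step on squared error 0.5 * (y - t)^2:
    // delta = (y - t) * y * (1 - y) is the error gradient at the output.
    void train(const std::array<double, 2>& x, double target, double rate) {
        double y = process(x);
        double delta = (y - target) * y * (1.0 - y);
        for (std::size_t i = 0; i < x.size(); ++i)
            w[i] -= rate * delta * x[i];
        b -= rate * delta;
    }
};
```

Trained repeatedly on the four rows of the AND truth table, such a neuron converges to outputs above 0.5 only for the input (1, 1), since AND is linearly separable.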

This code on GitHub, which my nephew and I wrote a couple of years ago, can serve as a more detailed illustration of the idea:

The same GitHub repo also has two sample projects that illustrate usage of the library.

The first sample trains a simple network on a simulated activity pattern and then uses the network to classify further simulated load as low, medium, or high:

The second sample trains a network on part of the MNIST data set of hand-written digits, and then uses the network to recognize the remaining part of the data set:
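The MNIST sample's evaluation pattern (train on one part of the data, classify the rest, count matches) can be sketched independently of the library. The `Item` struct, `argmax`, and `accuracy` below are illustrative stand-ins; a 10-class network would be plugged in where the callable `process` goes:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// One labeled example; for MNIST, input would hold 28x28 = 784 pixel values
// and label a digit 0..9. This is an assumed struct, not the library's type.
struct Item {
    std::vector<double> input;
    std::size_t label;
};

// The predicted class is the index of the largest element in the output vector.
std::size_t argmax(const std::vector<double>& v) {
    return static_cast<std::size_t>(
        std::max_element(v.begin(), v.end()) - v.begin());
}

// Fraction of items whose predicted class matches the label, given any
// callable that maps an input vector to an output vector (e.g. the network).
template <typename Net>
double accuracy(const std::vector<Item>& test_set, Net&& process) {
    std::size_t correct = 0;
    for (const auto& item : test_set)
        if (argmax(process(item.input)) == item.label)
            ++correct;
    return test_set.empty()
               ? 0.0
               : static_cast<double>(correct) / test_set.size();
}
```

With the proposed API, `process` would presumably be a lambda wrapping `network.process`.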

Thank you,
Serguei Martchenko
