
Subject: Re: [boost] Interest for Algorithmic Adjoint Differentiation (AAD) in Boost?
From: Antoine Savine (antoine_at_[hidden])
Date: 2017-11-22 11:57:27

I just realized that the attachment exceeds (by far) the maximum size. With apologies (this is my first post), I am re-posting now without the attachment. The introduction can be found at the top level of the repository, under the name


Thank you.

Antoine Savine

From: Antoine Savine
Sent: 22 November 2017 12:38
To: 'boost_at_[hidden]' <boost_at_[hidden]>
Subject: Interest for Algorithmic Adjoint Differentiation (AAD) in Boost?

AAD is a hot topic in machine learning, computational finance, and, I believe, other disciplines such as meteorology. Adjoint differentiation computes a large number of derivative sensitivities of a calculation code in constant time, and operator overloading in C++ permits a semi-automatic implementation over templated code.

Please see the intro attached for details.

I have been working with AAD in finance for 5 years. I have developed and used professional AAD libraries, given dedicated talks and workshops at professional conferences such as Global Derivatives and WBS, and taught AAD in the MSc in Mathematical Finance at the University of Copenhagen (Mathematics Department). Our implementation at Danske Bank won Risk magazine's In-House System of the Year award in 2015.

I am planning a book on AAD. Although that publication is biased towards finance, I have developed an independent, simple, and efficient AAD engine that can be found here:

The implementation uses the latest developments in AAD technology for maximum performance, in particular expression templates.

I thought that, given the wide interest in AAD, a Boost.Aad module could make sense, and I was hoping that it could be based on my developments.

Please could somebody let me know if there is interest, and, if so, how I should proceed?

Many thanks.

Kind regards,

Antoine Savine

Boost list run by bdawes at, gregod at, cpdaniel at, john at