Boost Users:
From: Amit Gupta (gupta839_at_[hidden])
Date: 2022-01-25 18:31:52
I am currently exploring Boost as an autodiff framework for my
application. I would like to calculate the gradient of a
vector->scalar function. However, I could not find a way to evaluate
the individual partial derivatives; instead I always get the total
derivative, i.e. the sum of all the partials.
Example: how can I make the following program yield output equivalent to JAX's forward-mode jacfwd below?
#include <boost/math/differentiation/autodiff.hpp>
#include <iostream>
#include <vector>

// Sum of the p-th powers of the six components of z.
template <typename T>
T power_sum(double p, const std::vector<T>& z) {
    using std::pow;  // fall back to std::pow when T is a plain double
    T y = 0.0;
    for (int i = 0; i < 6; ++i) {
        y += pow(z[i], p);  // for fvar arguments, Boost's pow is found via ADL
    }
    return y;
}

int main() {
    using namespace boost::math::differentiation;
    constexpr unsigned Order = 2;
    std::vector<double> z_v = {1., 2., 3., 4., 5., 6.};
    std::vector<autodiff_fvar<double, Order>> z;
    for (int i = 0; i < 6; ++i) {
        z.push_back(make_fvar<double, Order>(z_v[i]));
    }
    auto y = power_sum(2.0, z);
    std::cout << y << "\n";
    std::cout << y.derivative(1) << "\n";  // 42: the sum of the six partials
    return 0;
}
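Here y.derivative(1) comes out as 42, i.e. the sum of the six partials (2 + 4 + ... + 12), presumably because every make_fvar call introduces the same single independent variable, so all components are differentiated together.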
JAX output:
import jax as jx
import jax.numpy as jnp
x = jnp.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
def power_sum(x, p):
    y = x[0] ** p + x[1] ** p + x[2] ** p + x[3] ** p + x[4] ** p + x[5] ** p
    return y
f_prime = jx.jacfwd(power_sum)(x, 2.0)
f_prime
# DeviceArray([ 2., 4., 6., 8., 10., 12.], dtype=float32)
I understand I can do it by sending the array six times, each time
with the derivative seed of every other vector entry set to zero,
i.e. treating them as constants (a sketch of that workaround follows
below). Is there a better way?
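For reference, here is a minimal sketch of that workaround, assuming I have understood the pattern correctly: on each pass only one component is promoted to an fvar while the rest stay plain doubles, so derivative(1) of the result is the partial with respect to that component. The helper name gradient_power_sum is my own, not anything from Boost:

#include <boost/math/differentiation/autodiff.hpp>
#include <cmath>
#include <iostream>
#include <vector>

// Hypothetical helper: builds the gradient of sum(z_i^p) one component
// at a time by seeding a single independent variable per pass.
std::vector<double> gradient_power_sum(double p, const std::vector<double>& z_v) {
    using namespace boost::math::differentiation;
    constexpr unsigned Order = 1;  // only first derivatives are needed
    std::vector<double> grad(z_v.size());
    for (std::size_t k = 0; k < z_v.size(); ++k) {
        autodiff_fvar<double, Order> y = 0.0;
        for (std::size_t i = 0; i < z_v.size(); ++i) {
            if (i == k)
                y += pow(make_fvar<double, Order>(z_v[i]), p);  // the one variable
            else
                y += std::pow(z_v[i], p);  // every other entry is a constant
        }
        grad[k] = y.derivative(1);  // d(sum z_i^p)/dz_k = p * z_k^(p-1)
    }
    return grad;
}

int main() {
    std::vector<double> z_v = {1., 2., 3., 4., 5., 6.};
    for (double g : gradient_power_sum(2.0, z_v))
        std::cout << g << " ";  // expected: 2 4 6 8 10 12
    std::cout << "\n";
    return 0;
}

This works, but it re-evaluates the whole sum once per component, which is exactly the overhead I was hoping to avoid.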