Partial Derivative using Autograd

Submitted by 天大地大妈咪最大 on 2020-01-01 06:55:07

Question


I have a function that takes a multivariate argument x, where x = [x1, x2, x3]. Say my function looks like f(x,T) = np.dot(x,T) + np.exp(np.dot(x,T)), where T is a constant.

I am interested in finding the functions df/dx1, df/dx2, and df/dx3.
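
(Analytically, each partial has the closed form df/dx_i = T_i * (1 + np.exp(np.dot(x,T))), which is handy as a reference for checking any implementation below.)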

I have achieved some success using scipy diff, but I am a bit skeptical because it relies on numerical differences. Yesterday, a colleague pointed me to Autograd (github). Since it seems to be a popular package, I am hoping someone here knows how to compute partial derivatives with it. My initial tests indicate that the grad function only differentiates with respect to the first argument; I am not sure how to extend it to the other arguments. Any help would be greatly appreciated.

Thanks.


Answer 1:


I found the following description of the grad function in the autograd source code:

def grad(fun, x):
    """Returns a function which computes the gradient of `fun` with
    respect to positional argument number `argnum`. The returned
    function takes the same arguments as `fun`, but returns the
    gradient instead. The function `fun` should be scalar-valued. The
    gradient has the same type as the argument."""

So

import autograd.numpy as np  # autograd's thin wrapper around numpy
from autograd import grad

def h(x, t):
    return np.dot(x, t) + np.exp(np.dot(x, t))

h_x = grad(h, 0)  # gradient of h with respect to x (argument 0)
h_t = grad(h, 1)  # gradient of h with respect to t (argument 1)
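
Calling h_x at a point returns the full gradient vector, whose components are exactly the partials the question asks for. A minimal continuation of the snippet above (the sample values for x and t are made up for illustration):

x = np.array([0.1, 0.2, 0.3])
t = np.array([1.0, 2.0, 3.0])

g = h_x(x, t)
# g[0], g[1], g[2] are df/dx1, df/dx2, df/dx3 evaluated at (x, t).
# Sanity check against the closed form df/dx_i = t_i * (1 + exp(x.T)):
print(np.allclose(g, t * (1 + np.exp(np.dot(x, t)))))  # True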

Also make sure to use the numpy library that ships with autograd:

import autograd.numpy as np

instead of

import numpy as np

so that autograd can trace and differentiate through the numpy operations you use.
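
As a sanity check on the original concern about numerical differences, the autograd gradient can be compared against scipy's finite-difference approximation. A sketch, assuming scipy is available (the step size and sample values are illustrative):

import autograd.numpy as np
from autograd import grad
from scipy.optimize import approx_fprime

def h(x, t):
    return np.dot(x, t) + np.exp(np.dot(x, t))

x = np.array([0.1, 0.2, 0.3])
t = np.array([1.0, 2.0, 3.0])

exact = grad(h, 0)(x, t)                # autograd: exact to machine precision
numeric = approx_fprime(x, h, 1e-8, t)  # scipy: forward finite differences
print(np.allclose(exact, numeric, atol=1e-5))  # True, up to finite-difference error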



Source: https://stackoverflow.com/questions/45599524/partial-derivative-using-autograd
