Matlab gradient and hessian computation for symbolic vector function


Question


I am trying to use the MATLAB "gradient" and "hessian" functions to calculate the derivative of a symbolic vector function with respect to a vector. Below is an example using the sigmoid function 1/(1+e^(-a)), where a is a feature vector multiplied by a weight vector. All of the versions below return an error. I am new to MATLAB and would very much appreciate any advice. The solution may well be under my nose in the documentation, but I have not been able to solve the issue. Thank you in advance for your help!
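
For reference, the scalar objective being differentiated appears to be the following (writing w·x for the dot product of the weight and feature vectors, which the accepted fix below also uses):

f(w) = (y - 1)(w \cdot x) + \log\left(\frac{1}{1 + e^{-w \cdot x}}\right), \qquad w \cdot x = w_1 x_1 + w_2 x_2 + w_3 x_3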

%version 1
syms y w x
x = sym('x', [1 3]);
w = sym('w', [1 3]);
f = (y-1)*w.*x + log(1/(1+exp(-w.*x)));
gradient(f, w)

%version 2
syms y w1 w2 w3 x1 x2 x3 x w
x = [x1,x2,x3];
w = [w1,w2,w3];
f = (y-1)*w.*x + log(1/(1+exp(-w.*x)));

%version 3
syms y w1 w2 w3 x1 x2 x3
f = (y-1)*[w1,w2,w3].*[x1,x2,x3] + log(1/(1+exp(-[w1,w2,w3].*[x1,x2,x3])));

Answer 1:


Thanks, Daniel - it turns out the problem was that I had not used dot() to take the dot product of w and x. Both diff() and gradient() give the same result, shown below:

syms y
x = sym('x', [1 3]);
w = sym('w', [1 3]);
f = (y-1)*dot(w,x) + log(1/(1+exp(dot(-w,x))));
diff(f, w(1))
gradient(f, w(1))

% ans =
% x1*(y - 1) + (x1*exp(- x1*conj(w1) - x2*conj(w2) - x3*conj(w3)))/(exp(- x1*conj(w1) - x2*conj(w2) - x3*conj(w3)) + 1)
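
For the Hessian mentioned in the question, the same scalar f can also be differentiated with respect to the whole vector w in one call. A minimal sketch (assuming the Symbolic Math Toolbox, reusing the variable names from the snippet above):

syms y
x = sym('x', [1 3]);                               % symbolic feature vector [x1 x2 x3]
w = sym('w', [1 3]);                               % symbolic weight vector [w1 w2 w3]
f = (y-1)*dot(w,x) + log(1/(1+exp(dot(-w,x))));    % scalar objective

g = gradient(f, w)                                 % 3x1 vector of partial derivatives df/dw(i)
H = simplify(hessian(f, w))                        % 3x3 matrix of second derivatives

Note that dot(w,x) conjugates its first argument (hence the conj(w1) terms in the output above); if that is not wanted, w*x.' or sum(w.*x) computes the plain sum w1*x1 + w2*x2 + w3*x3. If a row-vector layout is preferred, jacobian(f, w) returns the same partial derivatives as a 1x3 row.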


Source: https://stackoverflow.com/questions/29591277/matlab-gradient-and-hessian-computation-for-symbolic-vector-function
