derivative

Getting the gradient of a vectorized function in PyTorch

久未见, submitted on 2021-02-20 00:46:41
Question: I am brand new to PyTorch and want to do what I assume is a very simple thing, but I am having a lot of difficulty. I have the function sin(x) * cos(x) + x^2 and I want to get the derivative of that function at any point. If I do this with one point, it works perfectly:

    x = torch.autograd.Variable(torch.Tensor([4]), requires_grad=True)
    y = torch.sin(x) * torch.cos(x) + torch.pow(x, 2)
    y.backward()
    print(x.grad)  # outputs tensor([7.8545])

However, I want to be able to pass in a vector as x and for it …
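A minimal sketch of the vector case (an assumption about where the truncated question is going, not taken from the excerpt): since backward() needs a scalar, one common approach is to sum the element-wise outputs first; because each y[i] depends only on x[i], every entry of x.grad is then the derivative at that point. Note that torch.autograd.Variable is deprecated; a plain tensor with requires_grad=True does the same job.

    import torch

    x = torch.tensor([1.0, 2.0, 4.0], requires_grad=True)
    y = torch.sin(x) * torch.cos(x) + torch.pow(x, 2)
    # backward() requires a scalar output; summing is safe here because
    # each y[i] depends only on x[i], so d(sum)/dx[i] = dy[i]/dx[i].
    y.sum().backward()
    print(x.grad)  # cos(2x) + 2x at each point; the last entry is ≈ 7.8545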

Recursively find the second derivative

佐手、, submitted on 2021-02-19 23:37:10
Question: I'm attempting to write a program in C which finds the numeric value of the second derivative of f(x), where x is given. When I use my function to find the first derivative, everything seems to be working okay, but the recursive second-derivative part always gives me 0.0. Here's my function:

    double derivative(double (*f)(double), double x0, int order) {
        if (order == 1) {
            const double delta = 1.0e-6;
            double x1 = x0 - delta;
            double x2 = x0 + delta;
            double y1 = f(x1);
            double y2 = f(x2);
            return (y2 - y1 …
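The excerpt cuts off before the recursive branch, but the technique itself is a nested central difference. Below is a minimal Python sketch of that idea (the question's code is C; this is an illustrative translation, not the asker's program). One classic cause of a second derivative collapsing toward 0.0 is a step like 1e-6 that becomes too small once the differences are nested, so the sketch uses a larger one.

    import math

    def derivative(f, x0, order, delta=1e-4):
        # Central difference: f'(x0) ≈ (f(x0+delta) - f(x0-delta)) / (2*delta).
        if order == 1:
            return (f(x0 + delta) - f(x0 - delta)) / (2 * delta)
        # Higher orders: numerically differentiate the (order-1)-th derivative.
        g = lambda x: derivative(f, x, order - 1, delta)
        return (g(x0 + delta) - g(x0 - delta)) / (2 * delta)

    # Second derivative of sin at 1.0 is -sin(1) ≈ -0.8415.
    print(derivative(math.sin, 1.0, 2))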

Checking the gradient when doing gradient descent

允我心安, submitted on 2021-02-19 08:22:32
Question: I'm trying to implement a feed-forward backpropagating autoencoder (training with gradient descent) and want to verify that I'm calculating the gradient correctly. This tutorial suggests calculating the derivative of each parameter one at a time: grad_i(theta) = (J(theta_i + epsilon) - J(theta_i - epsilon)) / (2 * epsilon). I've written a sample piece of code in Matlab to do just this, but without much luck: the differences between the gradient calculated from the derivative and the gradient …
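A minimal Python sketch of that finite-difference gradient check (the question's code is Matlab; the names here are illustrative). Each parameter is perturbed one at a time and the estimate is compared against the analytic gradient; a large relative difference points at a bug in the backpropagation code.

    import numpy as np

    def numerical_gradient(J, theta, epsilon=1e-4):
        # Central-difference estimate of dJ/dtheta, one coordinate at a time.
        grad = np.zeros_like(theta)
        for i in range(theta.size):
            e = np.zeros_like(theta)
            e.flat[i] = epsilon
            grad.flat[i] = (J(theta + e) - J(theta - e)) / (2 * epsilon)
        return grad

    # Example: J(theta) = theta . theta has analytic gradient 2*theta.
    theta = np.array([1.0, -2.0, 3.0])
    J = lambda t: t @ t
    num = numerical_gradient(J, theta)
    ana = 2 * theta
    # The relative difference should be tiny (around 1e-9 or less).
    print(np.linalg.norm(num - ana) / np.linalg.norm(num + ana))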

How do I get SymPy to collect partial derivatives?

前提是你, submitted on 2021-01-27 19:59:03
Question: I have been using SymPy to expand the terms of a complex partial differential equation and would like to use the collect function to gather terms. However, it seems to have a problem dealing with second- (or higher-)order derivatives where the variables of differentiation differ. In the code example below, collect(expr6, ...) works, but collect(expr7, ...) does not, returning the error message "NotImplementedError: Improve MV Derivative support in collect". The error is clearly related to the psi …
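The excerpt's expr6/expr7 are not shown, but a common workaround for this NotImplementedError (an assumption on my part, not from the excerpt) is to replace each Derivative with a plain placeholder symbol, collect on the placeholders, then substitute back. A small self-contained sketch:

    import sympy as sp

    x, y, a, b = sp.symbols('x y a b')
    psi = sp.Function('psi')(x, y)

    # Mixed and repeated second derivatives, the kind that trips collect().
    Dxx = sp.Derivative(psi, x, 2)
    Dxy = sp.Derivative(psi, x, y)
    expr = a * Dxy + b * Dxy + a * Dxx + b * Dxx

    # Workaround: swap every Derivative for a Dummy, collect, swap back.
    subs_map = {d: sp.Dummy() for d in expr.atoms(sp.Derivative)}
    collected = expr.subs(subs_map).collect(list(subs_map.values()))
    print(collected.subs({v: k for k, v in subs_map.items()}))
    # -> (a + b)*Derivative(psi(x, y), (x, 2)) + (a + b)*Derivative(psi(x, y), x, y)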