derivative

Derivative of a conjugate in sympy

青春壹個敷衍的年華 submitted on 2019-12-06 10:22:18
When I try to differentiate a symbol with SymPy I get the following:

    In : x = Symbol('x')
    In : diff(x, x)
    Out: 1

When I differentiate the symbol with respect to its conjugate, the result is:

    In [55]: diff(x, x.conjugate())
    Out[55]: 0

However, when I try to differentiate the conjugate of the symbol, SymPy doesn't evaluate it:

    In : diff(x.conjugate(), x)
    Out: Derivative(conjugate(x), x)

This is still correct, but the result should be zero. How can I make SymPy perform the derivative of a conjugate?

I'm not sure about the mathematics of whether diff(conjugate(x), x) should be zero. The fact that diff(x, x.conjugate()) gives …
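If the goal is simply to make the derivative evaluate, here are two workarounds (my suggestions, not from the truncated answer): declare the symbol real, so conjugate is the identity, or substitute an independent symbol for the conjugate before differentiating (the Wirtinger-calculus view).

    from sympy import symbols, conjugate, diff

    x = symbols('x')
    print(diff(conjugate(x), x))        # stays as Derivative(conjugate(x), x)

    # Workaround 1: a real symbol, where conjugate(x) == x, so the derivative is 1
    xr = symbols('xr', real=True)
    print(diff(conjugate(xr), xr))      # 1

    # Workaround 2 (Wirtinger-style): treat conjugate(x) as an independent
    # variable, so the derivative with respect to x is 0
    xb = symbols('xb')                  # stand-in for conjugate(x)
    print(diff(conjugate(x).subs(conjugate(x), xb), x))   # 0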

R Hessian Matrix

▼魔方 西西 submitted on 2019-12-06 07:54:59
I need to create the Hessian matrix of a function given as:

    func <- expression(sin(x+y) + cos(x-y))
    vars <- c("x", "y")

I need the second-order derivatives as expressions too, and I need to evaluate them many times, so I made a list of first-order derivatives and a list of lists of second-order derivatives.

    funcD <- lapply(vars, function(v) D(func, v))
    funcDD <- list()
    for (i in 1:length(vars)) funcDD[[i]] <- lapply(vars, function(v) D(funcD[[i]], v))

So far, it works:

    > funcDD
    [[1]]
    [[1]][[1]]
    -(sin(x + y) + cos(x - y))

    [[1]][[2]]
    -(sin(x + y) - cos(x - y))

    [[2]]
    [[2]][[1]]
    cos(x - y) - sin(x + y) …
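For comparison, a minimal Python/SymPy sketch that builds the same 2x2 Hessian symbolically (my cross-check, not part of the original R question):

    import sympy as sp

    x, y = sp.symbols('x y')
    f = sp.sin(x + y) + sp.cos(x - y)
    H = sp.hessian(f, (x, y))   # 2x2 matrix of second-order partial derivatives
    print(H)                    # entries match the funcDD list above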

Explicit formula versus symbolic derivatives in R

走远了吗. submitted on 2019-12-05 21:15:09
I would like to evaluate higher-order derivatives of some function f in R. Two possibilities are available to me. Either (1) I determine a general expression for f^(k), the k-th derivative of f (which I can do in my particular case), and then evaluate it; or (2) I take advantage of R's capacity for symbolic differentiation (the function D()). What are the advantages of 1 over 2? Let us say that f^(k) is not a recursive formula. What if f^(k) is recursive? Any hint will be appreciated.

G. Grothendieck: Symbolic differentiation is less error prone than doing it by hand. For low orders I would …
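To make the two options concrete, here is a small Python/SymPy sketch (my illustration with a made-up function, not from the post): option 1 is the hand-derived closed form, option 2 is repeated symbolic differentiation, and the two agree.

    import sympy as sp

    x = sp.symbols('x')
    f = sp.exp(2 * x)                  # hypothetical f with a known closed form
    k = 5
    symbolic = sp.diff(f, x, k)        # option 2: k-th symbolic derivative
    explicit = 2**k * sp.exp(2 * x)    # option 1: hand-derived f^(k)
    assert sp.simplify(symbolic - explicit) == 0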

How do you evaluate a derivative in Python?

匆匆过客 submitted on 2019-12-05 11:18:59
I'm a beginner in Python. I've recently learned about SymPy and its symbolic manipulation capabilities, in particular differentiation. I am trying to do the following in the easiest way possible:

1. Define f(x,y) = x^2 + xy^2.
2. Differentiate f with respect to x, so f'(x,y) = 2x + y^2.
3. Evaluate the derivative, e.g., f'(1,1) = 2 + 1 = 3.

I know how to do 1 and 2. The problem is, when I try to evaluate the derivative in step 3, I get an error that Python can't calculate the derivative. Here is a minimal working example:

    import sympy as sym
    import math

    def f(x,y):
        return x**2 + x*y**2

    x, y = sym.symbols('x y') …
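A working version of all three steps (my sketch; subs or lambdify handles the evaluation the asker is missing):

    import sympy as sym

    x, y = sym.symbols('x y')
    f = x**2 + x*y**2
    dfdx = sym.diff(f, x)              # step 2: 2*x + y**2
    print(dfdx.subs({x: 1, y: 1}))     # step 3: 3

    g = sym.lambdify((x, y), dfdx)     # fast numeric function for repeated use
    print(g(1, 1))                     # 3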

How is the gradient and hessian of logarithmic loss computed in the custom objective function example script in xgboost's github repository?

北慕城南 submitted on 2019-12-05 07:53:45
I would like to understand how the gradient and hessian of the logloss function are computed in an xgboost sample script. I've simplified the function to take numpy arrays, and generated y_hat and y_true, which are a sample of the values used in the script. Here is a simplified example:

    import numpy as np

    def loglikelihoodloss(y_hat, y_true):
        prob = 1.0 / (1.0 + np.exp(-y_hat))
        grad = prob - y_true
        hess = prob * (1.0 - prob)
        return grad, hess

    y_hat = np.array([1.80087972, -1.82414818, -1.82414818, 1.80087972,
                      -2.08465433, -1.82414818, -1.82414818, 1.80087972,
                      -1.82414818, -1.82414818])
    y_true …
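For reference, the derivation behind those two lines (standard logistic-loss calculus; my write-up, not spelled out in the post). With p the predicted probability:

    p = \sigma(\hat{y}) = \frac{1}{1 + e^{-\hat{y}}}, \qquad
    L = -\left[ y \log p + (1 - y) \log(1 - p) \right]

    \frac{\partial L}{\partial \hat{y}} = p - y \quad (\text{grad}), \qquad
    \frac{\partial^2 L}{\partial \hat{y}^2} = p (1 - p) \quad (\text{hess})

both following from dp/d\hat{y} = p(1 - p).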

C# - Finding Peaks within a Given Width via Quadratic Fit

核能气质少年 submitted on 2019-12-04 20:16:40
I'm working on an algorithm to find peaks in a List object. I'd thought up what I believed was a good (or good enough) algorithm for doing this: look at a point and its neighbors and, if it is a peak, add it to the results list. However, given some recent results, I don't think this method works as well as I'd initially hoped. (I've included the code I'm currently using, and hope to replace, below.) I've done a little work with LabVIEW before, and I know that the way its peak/valley-finding module works would suit what I need to do. I did some research into how LabVIEW does this and found …
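As I understand it, LabVIEW's peak detector fits a parabola to each group of `width` consecutive samples and reports a peak where the fit qualifies. A minimal sketch of that idea in Python/NumPy (my illustration, not the asker's C#):

    import numpy as np

    def quadratic_peaks(y, width=5, threshold=0.0):
        # Fit a parabola to each sliding window of `width` samples; report a
        # peak where the parabola opens downward, its vertex lies near the
        # window centre, and the fitted height clears the threshold.
        half = width // 2
        xs = np.arange(width)
        peaks = []
        for i in range(half, len(y) - half):
            a, b, c = np.polyfit(xs, y[i - half:i + half + 1], 2)
            if a < 0:
                v = -b / (2 * a)               # vertex position in window coords
                height = c - b * b / (4 * a)   # fitted height at the vertex
                if abs(v - half) <= 0.5 and height > threshold:
                    peaks.append((i, height))  # (sample index, fitted height)
        return peaks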

Why are dFdx/ddx and dFdy/ddy two-component variables when querying a 2D texture?

こ雲淡風輕ζ submitted on 2019-12-04 13:15:46
Question: I cannot seem to understand this: shouldn't the derivative/change along the U or V coordinate in a 2D texture/array be a single-component variable, since we are checking it only along ddx (the U coordinate) or ddy (the V coordinate)?

Answer 1: There are 4 distinct partial derivatives here: du/dx, dv/dx, du/dy, and dv/dy. None of those four values need be zero unless the texture image coordinates happen to be perfectly aligned to the display screen axes. In general the texture coordinate axes need not be …
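In other words (my notation, not the answerer's): ddx and ddy each return a float2 because together they form the Jacobian of the texture coordinates (u, v) with respect to screen position (x, y), one row per screen axis:

    \frac{\partial(u,v)}{\partial(x,y)} =
    \begin{pmatrix}
    \partial u/\partial x & \partial v/\partial x \\
    \partial u/\partial y & \partial v/\partial y
    \end{pmatrix},
    \qquad
    \mathrm{ddx}(uv) = \left( \frac{\partial u}{\partial x}, \frac{\partial v}{\partial x} \right),
    \quad
    \mathrm{ddy}(uv) = \left( \frac{\partial u}{\partial y}, \frac{\partial v}{\partial y} \right)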

Implement ReLU derivative in Python NumPy

一世执手 submitted on 2019-12-04 09:32:27
Question: I'm trying to implement a function that computes the ReLU derivative for each element in a matrix and then returns the result in a matrix. I'm using Python and NumPy. Based on other Cross Validated posts, the ReLU derivative for x is 1 when x > 0, 0 when x < 0, and undefined or 0 when x == 0. Currently, I have the following code so far:

    def reluDerivative(self, x):
        return np.array( …
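A common vectorized implementation (my sketch, taking the derivative at x == 0 to be 0; not the asker's original code):

    import numpy as np

    def relu_derivative(x):
        # 1.0 where x > 0, 0.0 elsewhere (including the x == 0 convention)
        return (x > 0).astype(float)

    print(relu_derivative(np.array([-1.5, 0.0, 2.0])))   # [0. 0. 1.]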

Keras: calculating derivatives of model output wrt input returns [None]

我与影子孤独终老i submitted on 2019-12-03 20:58:34
I need help with calculating derivatives of the model output with respect to the inputs in Keras. I want to add a regularization functional to the loss function; the regularizer contains the derivative of the classifier function, so I tried to take the derivative of the model output. The model is an MLP with one hidden layer, and the dataset is MNIST. When I compile the model and take the derivative, I get [None] as the result instead of the derivative function. I have seen a similar post, but didn't get an answer there either: "Taking derivative of Keras model wrt to inputs is returning all zeros". Here is my code. Please …
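A [None] gradient usually means the tensor being differentiated is not connected to the tensor you differentiate with respect to. A minimal sketch of getting d(output)/d(input) with TensorFlow 2's GradientTape (my example model, not the asker's; in the TF1-era Keras backend the analogous call is K.gradients(model.output, model.input)):

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu', input_shape=(784,)),
        tf.keras.layers.Dense(10, activation='softmax'),
    ])

    x = tf.random.normal((1, 784))
    with tf.GradientTape() as tape:
        tape.watch(x)                # x is a plain tensor, so watch it explicitly
        y = model(x)
    grads = tape.gradient(y, x)      # d(output)/d(input), shape (1, 784)
    print(grads.shape)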

Calculate the derivative of a data-function in R

前提是你 submitted on 2019-12-03 16:24:52
Is there an easy way to calculate the derivative of a non-linear function that is given by data? For example:

    x = 1 / c(1000:1)
    y = x^-1.5
    ycs = cumsum(y)
    plot(x, ycs, log="xy")

How can I calculate the derivative of the function given by x and ycs?

Was also going to suggest an example of a smoothed spline fit followed by prediction of the derivative. In this case, the results are very similar to the diff calculation described by @dbaupp:

    spl <- smooth.spline(x, y=ycs)
    pred <- predict(spl)
    plot(x, ycs, log="xy")
    lines(pred, col=2)
    ycs.prime <- diff(ycs)/diff(x)
    pred.prime <- …
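The same spline idea in Python (my sketch with SciPy, not the answerer's R code; a fitted spline object can return its derivative directly):

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    x = 1.0 / np.arange(1000, 0, -1)      # mirrors the R example; x is increasing
    ycs = np.cumsum(x ** -1.5)

    spl = UnivariateSpline(x, ycs, s=0)   # interpolating spline through the data
    dydx = spl.derivative()(x)            # evaluate the spline's derivative at x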