What does numpy.gradient do?

旧巷少年郎 2020-12-23 02:15

So I know what the gradient of a (mathematical) function is, so I feel like I should know what numpy.gradient does. But I don't. The documentation is not really helpful either.

4 Answers
  • 2020-12-23 02:31

    Also in the documentation¹:

    >>> import numpy as np
    >>> y = np.array([1, 2, 4, 7, 11, 16], dtype=float)
    >>> j = np.gradient(y)
    >>> j
    array([ 1. ,  1.5,  2.5,  3.5,  4.5,  5. ])
    
    • Gradient is defined as (change in y)/(change in x).
    • x, here, is the index, so the difference between adjacent values is 1.

    • At the boundaries, the first difference is calculated. This means that at each end of the array, the gradient given is simply the difference between the two end values (divided by 1).

    • Away from the boundaries, the gradient for a particular index is given by taking the difference between the values on either side and dividing by 2.

    So, the gradient of y, above, is calculated thus:

    j[0] = (y[1]-y[0])/1 = (2-1)/1  = 1
    j[1] = (y[2]-y[0])/2 = (4-1)/2  = 1.5
    j[2] = (y[3]-y[1])/2 = (7-2)/2  = 2.5
    j[3] = (y[4]-y[2])/2 = (11-4)/2 = 3.5
    j[4] = (y[5]-y[3])/2 = (16-7)/2 = 4.5
    j[5] = (y[5]-y[4])/1 = (16-11)/1 = 5
    

    You could, for example, find the minima of the absolute values in the resulting array to locate the turning points of a curve, as sketched below.
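
    A minimal sketch of that idea (the sample array here is my own, not from the docs): the gradient of a peaked array is closest to zero at the peak, so np.argmin over the absolute gradient picks out the turning point.

    >>> import numpy as np
    >>> y = np.array([1.0, 4.0, 9.0, 4.0, 1.0])
    >>> g = np.gradient(y)  # [ 3.,  4.,  0., -4., -3.]
    >>> int(np.argmin(np.abs(g)))  # index where the slope is closest to zero
    2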


    ¹ The array is actually called x in the example in the docs; I've changed it to y to avoid confusion.

  • 2020-12-23 02:46

    The gradient is computed using central differences in the interior and first differences at the boundaries.

    and

    The default distance is 1

    This means that in the interior it is computed as

    f'(x) ≈ [f(x+h) - f(x-h)] / (2h)

    where h = 1.0, and at the boundaries as the one-sided first difference

    f'(x) ≈ [f(x+h) - f(x)] / h
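
    A short sketch of that rule, assuming uniform spacing (the helper name gradient_1d is mine, not NumPy's):

    import numpy as np

    def gradient_1d(y, h=1.0):
        # Central differences in the interior, first (one-sided) differences
        # at the two ends, matching np.gradient's default behaviour.
        y = np.asarray(y, dtype=float)
        g = np.empty_like(y)
        g[1:-1] = (y[2:] - y[:-2]) / (2 * h)  # interior: central difference
        g[0] = (y[1] - y[0]) / h              # left end: forward difference
        g[-1] = (y[-1] - y[-2]) / h           # right end: backward difference
        return g

    y = np.array([1.0, 2.0, 4.0, 7.0, 11.0, 16.0])
    print(gradient_1d(y))  # [1.  1.5 2.5 3.5 4.5 5. ]
    print(np.gradient(y))  # same values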

  • 2020-12-23 02:47

    Here is what is going on. The Taylor series expansion guides us on how to approximate the derivative, given the values at nearby points. The simplest estimate comes from the first-order Taylor expansion of a C^2 function (one with two continuous derivatives)...

    • f(x+h) = f(x) + f'(x)h + f''(xi)h^2/2, where xi is some point between x and x+h (the Lagrange remainder).

    One can solve for f'(x)...

    • f'(x) = [f(x+h) - f(x)]/h + O(h).

    Can we do better? Yes indeed. If we assume C^3, then the Taylor expansion is

    • f(x+h) = f(x) + f'(x)h + f''(x)h^2/2 + f'''(xi) h^3/6, and
    • f(x-h) = f(x) - f'(x)h + f''(x)h^2/2 - f'''(xi) h^3/6.

    Subtracting these (both the h^0 and h^2 terms drop out!) and solving for f'(x) gives:

    • f'(x) = [f(x+h) - f(x-h)]/(2h) + O(h^2).

    So, if we have a discretized function defined on equally spaced points x = x_0, x_1 = x_0 + h, ..., x_n = x_0 + nh, then numpy gradient will yield a "derivative" array using the first-order estimate at the ends and the better second-order estimates in the middle.
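
    A quick way to see those error orders in practice (my own check, using f(x) = sin(x), whose exact derivative is cos(x)):

    import numpy as np

    f, df = np.sin, np.cos  # test function and its exact derivative
    x = 1.0
    for h in (0.1, 0.01):
        forward = (f(x + h) - f(x)) / h            # O(h) one-sided estimate
        central = (f(x + h) - f(x - h)) / (2 * h)  # O(h^2) central estimate
        print(h, abs(forward - df(x)), abs(central - df(x)))
    # Shrinking h by 10x cuts the forward error ~10x but the central error ~100x.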

    Example 1. If you don't specify any spacing, the interval is assumed to be 1. So if you call

    f = np.array([5, 7, 4, 8])
    

    what you are saying is that f(0) = 5, f(1) = 7, f(2) = 4, and f(3) = 8. Then

    np.gradient(f) 
    

    will be: f'(0) = (7 - 5)/1 = 2, f'(1) = (4 - 5)/(2*1) = -0.5, f'(2) = (8 - 7)/(2*1) = 0.5, f'(3) = (8 - 4)/1 = 4.
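
    Checking this directly (a quick verification of my own):

    import numpy as np
    print(np.gradient(np.array([5, 7, 4, 8], dtype=float)))
    # [ 2.  -0.5  0.5  4. ]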

    Example 2. If you specify a single spacing, the spacing is uniform but not 1.

    For example, if you call

    np.gradient(f, 0.5)
    

    this is saying that h = 0.5, not 1, i.e., the function is really f(0) = 5, f(0.5) = 7, f(1.0) = 4, f(1.5) = 8. The net effect is to replace h = 1 with h = 0.5 and all the results will be doubled.
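
    Concretely (my own check, same array as above):

    import numpy as np
    print(np.gradient(np.array([5, 7, 4, 8], dtype=float), 0.5))
    # [ 4. -1.  1.  8.]  (each value is double the h = 1 result)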

    Example 3. Suppose the discretized function f(x) is not defined on uniformly spaced intervals, for instance f(0) = 5, f(1) = 7, f(3) = 4, f(3.5) = 8. Then numpy's gradient function uses a messier discretized differentiation formula, and you get the discretized derivatives by calling

    np.gradient(f, np.array([0,1,3,3.5]))
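
    With a coordinate array as the second argument (supported since NumPy 1.13), the result comes out as follows (my own check, not part of the original answer):

    import numpy as np
    f = np.array([5, 7, 4, 8], dtype=float)
    x = np.array([0.0, 1.0, 3.0, 3.5])  # non-uniform sample locations
    print(np.gradient(f, x))
    # approx. [2.    0.833 6.1   8.  ]: one-sided differences at the two ends,
    # a second-order (quadratic-fit) formula at the interior points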
    

    Lastly, if your input is a 2D array, then you are thinking of a function f of x and y defined on a grid. numpy's gradient will then output one array per axis, containing the "discretized" partial derivatives along x and y, as in the sketch below.
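
    A minimal sketch of the 2D case (the array values are my own):

    import numpy as np

    z = np.array([[1.0, 2.0,  6.0],
                  [3.0, 4.0, 10.0]])

    # np.gradient returns one array per axis: rows (axis 0) first, then columns (axis 1)
    dz_d0, dz_d1 = np.gradient(z)
    print(dz_d0)  # [[2. 2. 4.]
                  #  [2. 2. 4.]]
    print(dz_d1)  # [[1.  2.5 4. ]
                  #  [1.  3.5 6. ]]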

  • 2020-12-23 02:47

    Think of an N-dimensional array as a matrix. The gradient is then nothing other than numerical differentiation of that matrix along each axis.

    For a good explanation, look at the description of gradient in the MATLAB documentation.
