How to implement the derivative of Leaky Relu in python?

Submitted by 半腔热情 on 2021-01-27 04:50:27

Question


How would I implement the derivative of Leaky ReLU in Python without using Tensorflow?

Is there a better way than this? I want the function to return a NumPy array:

import numpy as np

def dlrelu(x, alpha=0.01):
    # return alpha if x < 0 else 1
    return np.array([1 if i >= 0 else alpha for i in x])

Thanks in advance for the help


Answer 1:


The method you use works, but strictly speaking you are computing only the local derivative of the activation with respect to its input. In backpropagation you usually also need the gradient arriving from the layer above, so it can be wise to pass that incoming gradient in as well and return dl/dx directly.

In any case, you can avoid the loop, which is more efficient for large x. Here is one way to do it:

def dlrelu(x, alpha=0.01):
    dx = np.ones_like(x)
    dx[x < 0] = alpha
    return dx
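As a quick sanity check of the vectorized version (the input values below are chosen arbitrarily), note that it maps every non-negative entry to 1 and every negative entry to alpha in a single pass:

```python
import numpy as np

def dlrelu(x, alpha=0.01):
    # Derivative of leaky ReLU: 1 where x >= 0, alpha where x < 0
    dx = np.ones_like(x)
    dx[x < 0] = alpha
    return dx

x = np.array([-2.0, -0.5, 0.0, 3.0])
print(dlrelu(x))  # [0.01 0.01 1.   1.  ]
```

Boolean mask indexing like `dx[x < 0]` runs in C inside NumPy, so this is much faster than a Python-level list comprehension for large arrays.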

If you also pass the error from the layer above, it looks like this:

def dlrelu(dl, x, alpha=0.01):
    """dl and x have the same shape."""
    dx = np.ones_like(x)
    dx[x < 0] = alpha
    return dx * dl
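To illustrate how this variant fits into a backward pass, here is a minimal sketch (the upstream gradient `dl` and pre-activation values `x` are made-up example values, not from any real network):

```python
import numpy as np

def dlrelu(dl, x, alpha=0.01):
    """Multiply the upstream gradient dl by the local leaky-ReLU derivative."""
    dx = np.ones_like(x)
    dx[x < 0] = alpha
    return dx * dl

# Hypothetical pre-activation values and upstream gradient (same shape)
x = np.array([-1.0, 2.0])
dl = np.array([0.5, 0.5])

grad = dlrelu(dl, x)
print(grad)  # [0.005 0.5  ] -- chain rule: dl * alpha where x < 0, dl * 1 elsewhere
```

This is just the chain rule applied elementwise: the gradient flowing back through the activation is the incoming gradient scaled by the activation's local slope.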


Source: https://stackoverflow.com/questions/48102882/how-to-implement-the-derivative-of-leaky-relu-in-python
