Deep Learning: Activation Functions
Three commonly used nonlinear activation functions are sigmoid, tanh, and relu. Their mathematical expressions are:

- sigmoid: y = 1 / (1 + e^(-x))
- tanh: y = (e^x - e^(-x)) / (e^x + e^(-x))
- relu: y = max(0, x)

They can be implemented and plotted as follows (the original listing was cut off after the sigmoid subplot; the relu subplot below completes the obvious pattern):

```python
import numpy as np
import matplotlib.pyplot as plt

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def tanh(x):
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

def relu(x):
    return np.maximum(0, x)

x = np.arange(-5, 5, 0.1)

# tanh: S-shaped curve through the origin, output range (-1, 1)
p1 = plt.subplot(311)
y = tanh(x)
p1.plot(x, y)
p1.set_title('tanh')
p1.axhline(ls='--', color='r')
p1.axvline(ls='--', color='r')

# sigmoid: output range (0, 1), crosses y = 0.5 at x = 0
p2 = plt.subplot(312)
y = sigmoid(x)
p2.plot(x, y)
p2.set_title('sigmoid')
p2.axhline(0.5, ls='--', color='r')
p2.axvline(ls='--', color='r')

# relu: zero for negative inputs, identity for positive inputs
p3 = plt.subplot(313)
y = relu(x)
p3.plot(x, y)
p3.set_title('relu')

plt.show()