I'm writing some basic neural network methods - specifically the activation functions - and have hit the limits of my rubbish knowledge of math. I understand the respective ranges…
Logistic function: e^x / (e^x + e^c)
Special ("standard") case of the logistic function: 1 / (1 + e^(-x))
Bipolar sigmoid: never heard of it.
Tanh: (e^x - e^(-x)) / (e^x + e^(-x))
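A minimal Python sketch of those functions as written above (the names are mine, and for "bipolar sigmoid" I'm assuming the definition I've usually seen, 2 / (1 + e^(-x)) - 1, which may or may not match your source; the naive forms can overflow for large |x|):

```python
import math

def logistic(x, c=0.0):
    # General form from above: e^x / (e^x + e^c)
    return math.exp(x) / (math.exp(x) + math.exp(c))

def standard_logistic(x):
    # Special ("standard") case: 1 / (1 + e^(-x)), range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def bipolar_sigmoid(x):
    # Assumed definition: standard logistic rescaled to (-1, 1)
    return 2.0 / (1.0 + math.exp(-x)) - 1.0

def tanh(x):
    # (e^x - e^(-x)) / (e^x + e^(-x)), range (-1, 1); math.tanh does the same
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))
```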
Sigmoid usually refers to the shape (and limits), so yes, tanh is a sigmoid function. But in some contexts it refers specifically to the standard logistic function, so you have to be careful. And yes, you could use any sigmoid function and probably do just fine.
(2 / (1 + Exp(-2 * x))) - 1 is equivalent to tanh(x).
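A quick numerical spot-check of that identity in Python (the test values are arbitrary):

```python
import math

# 2 / (1 + e^(-2x)) - 1 should equal tanh(x) for all x
for x in (-3.0, -0.5, 0.0, 0.5, 3.0):
    lhs = 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0
    assert abs(lhs - math.tanh(x)) < 1e-12
```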