I've been profiling an application all day long and, having optimized a couple of bits of code, I'm left with this on my todo list. It's the activation function for a neural network.
There are much faster functions that do very similar things:
x / (1 + abs(x)) – a fast replacement for tanh
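A minimal sketch of what that looks like in code (Python here purely for illustration; fast_tanh is my name, not anything from the original post):

    import math

    def fast_tanh(x):
        # x / (1 + |x|): odd, smooth, saturating toward +/-1,
        # with the same general S-shape as tanh but no exp() call
        return x / (1.0 + abs(x))

    # quick sanity check against the real tanh
    for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
        print(f"x={x:+.1f}  fast={fast_tanh(x):+.4f}  tanh={math.tanh(x):+.4f}")

Note that it saturates much more slowly than tanh (fast_tanh(2) ≈ 0.667 vs tanh(2) ≈ 0.964), so it matches the shape rather than the exact values.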
And similarly:
x / (2 + 2 * abs(x)) + 0.5 – a fast replacement for sigmoid
Compare the plots with the actual sigmoid; a quick numerical comparison is sketched below.
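A comparison sketch under the same assumptions (fast_sigmoid is a hypothetical name; the reference is the exact logistic sigmoid, 1 / (1 + exp(-x))):

    import math

    def fast_sigmoid(x):
        # x / (2 + 2|x|) + 0.5: maps the reals into (0, 1),
        # like the logistic sigmoid, without calling exp()
        return x / (2.0 + 2.0 * abs(x)) + 0.5

    def sigmoid(x):
        # exact logistic sigmoid, for reference
        return 1.0 / (1.0 + math.exp(-x))

    # tabulate both so the curves can be compared point by point
    for x in range(-5, 6):
        print(f"x={x:+d}  fast={fast_sigmoid(x):.4f}  exact={sigmoid(x):.4f}")

Both functions pass through 0.5 at x = 0 and have the same sign of slope everywhere, which is usually what matters for an activation function.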