Implementation of a softmax activation function for neural networks

失恋的感觉 · 2020-12-23 15:13

I am using a Softmax activation function in the last layer of a neural network, but I am having trouble implementing it in a numerically safe way.

A naive implementation exponentiates each output directly and divides by the sum of the exponentials. This breaks down in practice: for IEEE doubles, exp(x) overflows to inf once x exceeds roughly 709, and the subsequent division turns the outputs into NaN.
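The original snippet is cut off in the post; a minimal C++ sketch of what such a naive version might look like (softmax_naive is a hypothetical name, assuming the logits arrive as a std::vector<double>):

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Naive softmax: exponentiate each logit directly, then normalize.
    // exp() overflows to inf for logits above ~709 on IEEE doubles, and
    // the final division then produces inf/inf = NaN.
    std::vector<double> softmax_naive(const std::vector<double>& logits) {
        std::vector<double> y(logits.size());
        double sum = 0.0;
        for (std::size_t i = 0; i < logits.size(); ++i) {
            y[i] = std::exp(logits[i]);  // overflows for large logits
            sum += y[i];
        }
        for (double& v : y) v /= sum;    // NaN once sum has overflowed
        return y;
    }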

2 Answers
  •  抹茶落季 · 2020-12-23 15:40

    First go to log scale, i.e., calculate log(y) instead of y. The log of the numerator is trivial. To calculate the log of the denominator, you can use the following 'trick': http://lingpipe-blog.com/2009/06/25/log-sum-of-exponentials/
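    In code, the trick amounts to factoring the maximum logit m out of the sum, so that log(Σ exp(y_f)) = m + log(Σ exp(y_f − m)) and every exponent is at most 0. A minimal C++ sketch (log_softmax is a hypothetical name, assuming a non-empty std::vector<double> of logits):

        #include <algorithm>
        #include <cmath>
        #include <cstddef>
        #include <vector>

        // Stable log-softmax via the log-sum-exp trick: subtract the maximum
        // logit before exponentiating, so exp() never sees an argument > 0.
        std::vector<double> log_softmax(const std::vector<double>& logits) {
            const double m = *std::max_element(logits.begin(), logits.end());
            double sum = 0.0;
            for (double x : logits)
                sum += std::exp(x - m);              // each term is in (0, 1]
            const double lse = m + std::log(sum);    // log of the denominator
            std::vector<double> out(logits.size());
            for (std::size_t i = 0; i < logits.size(); ++i)
                out[i] = logits[i] - lse;            // log numerator - log denominator
            return out;
        }

    If probabilities rather than log-probabilities are needed, exponentiating the result afterwards is safe: every entry of log(y) is at most 0, so exp() cannot overflow.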
