Math optimization in C#

悲&欢浪女 · 2020-12-07 10:25

I've been profiling an application all day long and, having optimized a couple of bits of code, I'm left with this on my todo list. It's the activation function for a neural network.
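(The question body is truncated here. For context, a typical C# activation function of the kind discussed in this thread is a sigmoid; the sketch below is illustrative only and not necessarily the poster's exact code.)

```csharp
using System;

static class Activation
{
    // Illustrative sketch only, not necessarily the original poster's code.
    // A typical sigmoid activation, whose Math.Exp call dominates the cost:
    public static double Sigmoid(double x)
    {
        return 1.0 / (1.0 + Math.Exp(-x));
    }
}
```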

25 Answers
  •  醉梦人生
    2020-12-07 11:14

    1. Remember that any change to this activation function comes at the cost of different behavior. This even includes switching to float (and thus lowering the precision) or using activation substitutes. Only experimenting with your use case will show the right approach.
    2. In addition to the simple code optimizations, I would also recommend considering parallelizing the computations (i.e. leveraging multiple cores of your machine, or even multiple machines in the Windows Azure cloud) and improving the training algorithms. A parallelization sketch follows this list.
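    A minimal sketch of point 2, assuming the activation is evaluated over a large array of inputs (the class and method names are mine, for illustration):

    ```csharp
    using System;
    using System.Threading.Tasks;

    static class ParallelActivation
    {
        // Apply a sigmoid activation to every element, spreading the work
        // across the available cores with Parallel.For.
        public static void SigmoidInPlace(double[] values)
        {
            Parallel.For(0, values.Length, i =>
            {
                values[i] = 1.0 / (1.0 + Math.Exp(-values[i]));
            });
        }
    }
    ```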

    UPDATE: Post on lookup tables for ANN activation functions

    UPDATE 2: I removed the point about LUTs, since I had confused them with complete hashing. Thanks go to Henrik Gustafsson for putting me back on track. So memory is not an issue, although the search space still gets messed up by local extrema a bit.
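    For reference, a lookup-table approach of the kind the linked post describes could look roughly like the sketch below; the input range [-8, 8], the table size, and the nearest-entry lookup are my assumptions, to be tuned against your accuracy needs.

    ```csharp
    using System;

    // Rough sketch of a precomputed sigmoid lookup table (assumed parameters).
    sealed class SigmoidTable
    {
        private const double Min = -8.0, Max = 8.0;
        private const int Size = 4096;
        private readonly double[] _table = new double[Size];

        public SigmoidTable()
        {
            // Precompute sigmoid values at evenly spaced points over [Min, Max].
            for (int i = 0; i < Size; i++)
            {
                double x = Min + (Max - Min) * i / (Size - 1);
                _table[i] = 1.0 / (1.0 + Math.Exp(-x));
            }
        }

        public double Evaluate(double x)
        {
            // Clamp to the table range, then return the nearest precomputed entry.
            if (x <= Min) return _table[0];
            if (x >= Max) return _table[Size - 1];
            int index = (int)((x - Min) / (Max - Min) * (Size - 1));
            return _table[index];
        }
    }
    ```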
