Neural Network Always Produces Same/Similar Outputs for Any Input

猫巷女王i 2020-12-13 04:24

I have a problem where I am trying to create a neural network for Tic-Tac-Toe. However, for some reason, training the neural network causes it to produce nearly the same output for every input.

11 Answers
  •  暖寄归人
    2020-12-13 04:33

    I faced a similar issue earlier when my data was not properly normalized. Once I normalized the data, everything ran correctly (a sketch of that normalization step follows).
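
    A minimal sketch of the normalization, assuming a data frame df of numeric features (the name df is a placeholder, not from the original post):

        # Scale every feature to zero mean and unit variance before training.
        df_scaled <- as.data.frame(scale(df))

        # Alternatively, caret can normalize inside train():
        #   train(..., preProcess = c("center", "scale"))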

    Recently, I faced this issue again and, after debugging, I found that there can be another reason for a neural network giving the same output. If your network has a weight-decay term, such as the one in the RSNNS package, make sure the decay is not so large that all weights are driven to essentially zero.

    I was using the caret package in R. Initially, I was using a decay hyperparameter of 0.01. When I looked at the diagnostics, I saw that the RMSE was being calculated for each fold of cross-validation, but the Rsquared was always NA. In this case, all predictions were coming out to the same value.

    Once I reduced the decay to a much lower value (1E-5 and lower), I got the expected results.
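
    For reference, a hedged sketch of the kind of decay sweep that exposes this; df, y, and the "nnet" method here are stand-ins for illustration, not my exact setup:

        library(caret)

        # Sweep decay over several orders of magnitude. A cross-validated
        # Rsquared of NA for a given decay value is the tell-tale sign that
        # the network's predictions have collapsed to a single constant.
        fit <- train(
          y ~ ., data = df,
          method     = "nnet",                  # single-hidden-layer net from the nnet package
          preProcess = c("center", "scale"),    # normalize inputs as well
          trControl  = trainControl(method = "cv", number = 5),
          tuneGrid   = expand.grid(size = 5, decay = c(1e-1, 1e-2, 1e-3, 1e-5)),
          trace      = FALSE
        )

        fit$results   # compare RMSE / Rsquared across decay values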

    I hope this helps.
