Neural Network Always Produces Same/Similar Outputs for Any Input

猫巷女王i — asked 2020-12-13 04:24

I have a problem where I am trying to create a neural network for Tic-Tac-Toe. However, for some reason, training the neural network causes it to produce nearly the same output for every input.

11 Answers
  •  一个人的身影
    2020-12-13 04:40

    For me it was happening exactly as in your case: the output of the neural network was always the same, no matter the training data, the number of layers, etc.

    It turned out my back-propagation algorithm had a bug: in one place I was multiplying by -1 where it wasn't required.
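
    To see why a single stray sign flip can produce this symptom, here is a minimal sketch (my own illustration, not the original poster's code): one sigmoid neuron trained on binary OR, where `sign=-1` reproduces a spurious "multiply by -1" in the weight update. With the flipped sign every update climbs the loss instead of descending it, so the network never learns the mapping.

    ```python
    import math
    import random

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    # Truth table for binary OR
    DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

    def train(sign=+1, epochs=5000, lr=0.5):
        """Train one sigmoid neuron on OR.
        sign=+1 is correct gradient descent; sign=-1 simulates a
        stray "-1" in the update, which ascends the loss instead."""
        random.seed(0)
        w = [random.uniform(-0.5, 0.5) for _ in range(2)]
        b = 0.0
        for _ in range(epochs):
            for (x1, x2), t in DATA:
                y = sigmoid(w[0] * x1 + w[1] * x2 + b)
                # delta for squared error with a sigmoid: (y - t) * y * (1 - y)
                delta = (y - t) * y * (1 - y)
                w[0] -= sign * lr * delta * x1
                w[1] -= sign * lr * delta * x2
                b    -= sign * lr * delta
        return [sigmoid(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in DATA]

    print("correct update:", [round(y, 2) for y in train(+1)])
    print("buggy update:  ", [round(y, 2) for y in train(-1)])
    ```

    The correct version converges to the OR truth table; the buggy version does not, and with saturating activations such a sign error can drive the outputs toward near-constant values — exactly the "same output for any input" symptom.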

    There could be another problem like this in your code. The question is: how do you debug it?

    Steps to debug:

    Step 1: Write the algorithm so that it can take a variable number of layers and a variable number of input and output nodes.
    Step 2: Reduce the hidden layers to 0. Reduce the input to 2 nodes and the output to 1 node.
    Step 3: Now train it on the binary OR operation.
    Step 4: If it converges correctly, go to Step 9.
    Step 5: If it doesn't converge, train it on only 1 training sample.
    Step 6: Print all the forward- and back-propagation variables (weights, node outputs, deltas, etc.).
    Step 7: Take pen and paper and calculate all the variables manually.
    Step 8: Cross-verify the manual values against the algorithm's output.
    Step 9: If you don't find any problem with 0 hidden layers, increase the hidden layer size to 1 and repeat Steps 5-8.
    

    It sounds like a lot of work, but it works very well IMHO.
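
    The steps above can be sketched as a small harness (an illustration under my own naming, not a specific library's API): a fully connected network with a configurable layer list (Step 1), reduced to 2 inputs, no hidden layer, and 1 output (Step 2), trained on binary OR (Step 3). The `forward` method returns every layer's activations so they can be printed and checked by hand (Steps 6-8).

    ```python
    import math
    import random

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    class TinyNet:
        """Fully connected net with a configurable layer list (Step 1).
        sizes=[2, 1] means 2 inputs, no hidden layer, 1 output (Step 2)."""
        def __init__(self, sizes, seed=0):
            rng = random.Random(seed)
            # weights[l][j][i] connects input i of layer l to its node j
            self.weights = [[[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
                             for _ in range(n_out)]
                            for n_in, n_out in zip(sizes, sizes[1:])]
            self.biases = [[0.0] * n_out for n_out in sizes[1:]]

        def forward(self, x):
            acts = [list(x)]
            for W, B in zip(self.weights, self.biases):
                acts.append([sigmoid(sum(w * a for w, a in zip(row, acts[-1])) + b)
                             for row, b in zip(W, B)])
            return acts  # activations of every layer, for hand-checking (Step 6)

        def train_step(self, x, target, lr=0.5):
            acts = self.forward(x)
            # output-layer deltas for squared error with sigmoids
            deltas = [(y - t) * y * (1 - y) for y, t in zip(acts[-1], target)]
            for l in reversed(range(len(self.weights))):
                new_deltas = []
                if l > 0:  # back-propagate using the pre-update weights
                    for i, a in enumerate(acts[l]):
                        s = sum(self.weights[l][j][i] * d
                                for j, d in enumerate(deltas))
                        new_deltas.append(s * a * (1 - a))
                for j, d in enumerate(deltas):
                    for i, a in enumerate(acts[l]):
                        self.weights[l][j][i] -= lr * d * a
                    self.biases[l][j] -= lr * d
                deltas = new_deltas

    OR = [((0, 0), (0,)), ((0, 1), (1,)), ((1, 0), (1,)), ((1, 1), (1,))]
    net = TinyNet([2, 1])                 # Step 2: no hidden layer
    for _ in range(5000):                 # Step 3: train on binary OR
        for x, t in OR:
            net.train_step(x, t)
    for x, _ in OR:
        print(x, "->", round(net.forward(x)[-1][0], 2))  # Step 4: check convergence
    ```

    If this converges, change `TinyNet([2, 1])` to `TinyNet([2, 2, 1])` to exercise the hidden-layer path (Step 9); OR is linearly separable, so the 0-hidden-layer case isolates the update rule itself before any hidden-layer back-propagation is involved.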
