Weight Initialisation
Question: I plan to use the Nguyen-Widrow algorithm to initialise the weights of a neural network with multiple hidden layers. While researching it, I found a lot of ambiguities that I would like to clarify. The following is pseudocode for the Nguyen-Widrow algorithm:

    Initialize all weights of the hidden layers with random values
    For each hidden layer {
        beta = 0.7 * Math.pow(hiddenNeurons, 1.0 / numberOfInputs);
        For each neuron in the layer {
            For each weight of the neuron {
                Adjust the weight: divide it by the norm of that neuron's
                weight vector, then multiply it by beta
            }
        }
    }

Just
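A minimal sketch of the steps in the pseudocode above, written in Python for a single hidden layer. The function name, parameter names, and the uniform range [-0.5, 0.5] are my own assumptions, not part of the original algorithm description:

```python
import math
import random

def nguyen_widrow_init(n_inputs, n_hidden, seed=0):
    """Hypothetical sketch: Nguyen-Widrow init for one hidden layer.

    Returns a list of n_hidden weight vectors, each of length n_inputs.
    """
    rng = random.Random(seed)
    # Scaling factor: beta = 0.7 * H^(1/I), as in the pseudocode
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)
    # Step 1: random initial weights (assumed range [-0.5, 0.5])
    weights = [[rng.uniform(-0.5, 0.5) for _ in range(n_inputs)]
               for _ in range(n_hidden)]
    # Step 2: rescale each neuron's weight vector to have norm beta
    for w in weights:
        norm = math.sqrt(sum(x * x for x in w))
        for i in range(n_inputs):
            w[i] = beta * w[i] / norm
    return weights
```

After this rescaling, every hidden neuron's weight vector has Euclidean norm exactly beta, which is the point of the division-then-multiplication step in the pseudocode.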