Neural Network Initialization - Nguyen Widrow Implementation?

Submitted by 妖精的绣舞 on 2020-01-02 07:43:15

Question


I've had a go at implementing the Nguyen-Widrow algorithm (below), and it appears to work correctly, but I have some follow-up questions:

  • Does this look like a correct implementation?

  • Does Nguyen-Widrow initialization apply to any network topology/size? (e.g. a 5-layer autoencoder)

  • Is Nguyen-Widrow initialization valid for any input range? (0/1, -1/+1, etc.)

  • Is Nguyen-Widrow initialization valid for any activation function? (e.g. logistic, tanh, linear)

The code below assumes that the network's weights have already been randomized to -1/+1:

        ' Calculate the number of hidden neurons
        Dim HiddenNeuronsCount As Integer = Me.TotalNeuronsCount - (Me.InputsCount + Me.OutputsCount)

        ' Calculate the Beta value for all hidden layers
        Dim Beta As Double = (0.7 * Math.Pow(HiddenNeuronsCount, (1.0 / Me.InputsCount)))

        ' Loop through each layer in neural network, skipping input layer
        For i As Integer = 1 To Layers.GetUpperBound(0)

            ' Loop through each neuron in layer
            For j As Integer = 0 To Layers(i).Neurons.GetUpperBound(0)

                Dim InputsNorm As Double = 0

                ' Loop through each weight in neuron inputs, add weight value to InputsNorm
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    InputsNorm += Layers(i).Neurons(j).ConnectionWeights(k) * Layers(i).Neurons(j).ConnectionWeights(k)
                Next

                ' Add bias value to InputsNorm
                InputsNorm += Layers(i).Neurons(j).Bias * Layers(i).Neurons(j).Bias

                ' Finalize euclidean norm calculation
                InputsNorm = Math.Sqrt(InputsNorm)

                ' Loop through each weight in neuron inputs, scale the weight based on euclidean norm and beta
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    Layers(i).Neurons(j).ConnectionWeights(k) = (Beta * Layers(i).Neurons(j).ConnectionWeights(k)) / InputsNorm
                Next

                ' Scale the bias based on euclidean norm and beta
                Layers(i).Neurons(j).Bias = (Beta * Layers(i).Neurons(j).Bias) / InputsNorm

            Next

        Next
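
As a quick sanity check of the Beta formula (a worked example, using illustrative numbers rather than anything from my network): the paper's formula is β = 0.7 · h^(1/n), where h is the number of hidden neurons and n is the number of inputs. With n = 2 inputs and h = 4 hidden neurons, β = 0.7 · 4^(1/2) = 0.7 · 2 = 1.4, so each neuron's weight vector (bias included, in my code above) ends up rescaled to Euclidean length 1.4.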

Answer 1:


Nguyen & Widrow in their paper assume that the inputs lie between -1 and +1. Nguyen-Widrow initialization is valid for any activation function whose active region is of finite length. Also, their paper only deals with a 2-layer network (a single hidden layer), so I'm not sure about a 5-layer one.

S
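
For comparison, here is a minimal, self-contained sketch of the single-hidden-layer variant as I read it in the 1990 paper. All names (InputCount, NeuronCount, Weights, Bias) are illustrative, not from your class. One notable difference from your code: if I read the paper correctly, the bias is drawn uniformly from -Beta to +Beta rather than rescaled along with the weights, and Beta is computed per layer from that layer's neuron count and fan-in.

        ' Hypothetical sketch: Nguyen-Widrow for one hidden layer (illustrative names)
        Dim Rnd As New Random()
        Dim InputCount As Integer = 2      ' n: number of inputs to this layer
        Dim NeuronCount As Integer = 4     ' h: number of neurons in this layer
        Dim Weights(NeuronCount - 1, InputCount - 1) As Double
        Dim Bias(NeuronCount - 1) As Double

        ' Step 1: randomize weights to -1/+1
        For j As Integer = 0 To NeuronCount - 1
            For k As Integer = 0 To InputCount - 1
                Weights(j, k) = Rnd.NextDouble() * 2.0 - 1.0
            Next
        Next

        ' Step 2: Beta from this layer's neuron count and fan-in
        Dim Beta As Double = 0.7 * Math.Pow(NeuronCount, 1.0 / InputCount)

        ' Step 3: rescale each neuron's weight vector to Euclidean length Beta,
        ' and draw the bias uniformly from -Beta..+Beta (per the paper)
        For j As Integer = 0 To NeuronCount - 1
            Dim Norm As Double = 0
            For k As Integer = 0 To InputCount - 1
                Norm += Weights(j, k) * Weights(j, k)
            Next
            Norm = Math.Sqrt(Norm)

            For k As Integer = 0 To InputCount - 1
                Weights(j, k) = Beta * Weights(j, k) / Norm
            Next

            Bias(j) = (Rnd.NextDouble() * 2.0 - 1.0) * Beta
        Next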



Source: https://stackoverflow.com/questions/11868337/neural-network-initialization-nguyen-widrow-implementation
