Can't init the weights of my neural network PyTorch

Submitted by 牧云@^-^@ on 2021-01-29 08:27:29

Question


I can't initialize the weights with the function MyNet.apply(init_weights).

These are my functions:

def init_weights(net):
    if type(net) == torch.nn.Module:
        torch.nn.init.kaiming_uniform_(net.weight)
        net.bias.data.fill_(0.01)  # all biases set to 0.01

My neural net is the following:

class NeuralNet(torch.nn.Module):
    def __init__(self):
        super().__init__() # Necessary for torch to detect this class as trainable
        # Here define network architecture
        self.layer1 = torch.nn.Linear(28**2, 32).to(device) # Linear layer with 32 neurons
        self.layer2 = torch.nn.Linear(32, 64).to(device) # Linear layer with 64 neurons
        self.layer3 = torch.nn.Linear(64, 128).to(device)  # Linear layer with 128 neurons
        self.output = torch.nn.Linear(128, 1).to(device) # Linear layer with 1 output neuron (binary output)

    def forward(self, x):
        # Here define architecture behavior
        x = torch.sigmoid(self.layer1(x)).to(device) # x = torch.nn.functional.relu(self.layer1(x))
        x = torch.sigmoid(self.layer2(x)).to(device)  
        x = torch.sigmoid(self.layer3(x)).to(device)

        return torch.sigmoid(self.output(x)).to(device) # Binary output


type(net) prints as Linear, so it never gets inside the if statement, and if I remove the check it produces the following error:

AttributeError: 'NeuralNet' object has no attribute 'weight'


Answer 1:


You should initialize only the weights of the linear layers. net.apply(fn) calls fn recursively on every submodule, including the top-level NeuralNet container, which has no weight attribute of its own; checking for torch.nn.Linear matches each linear layer and skips the container:

def init_weights(net):
    if type(net) == torch.nn.Linear:
        torch.nn.init.kaiming_uniform_(net.weight)
        net.bias.data.fill_(0.01)  # all biases set to 0.01
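
For reference, a minimal sketch of how this would be wired up, assuming the NeuralNet class from the question (the question uses a model called MyNet; the variable name model here is just an illustration):

model = NeuralNet()        # the network defined in the question
model.apply(init_weights)  # apply() visits every submodule; only the Linear layers pass the type check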


Source: https://stackoverflow.com/questions/59156629/cant-init-the-weights-of-my-neural-network-pytorch
