Why is the PyTorch Dropout layer affecting all values, not only the ones set to zero?
**Question:** The dropout layer from PyTorch changes the values that are not set to zero. Using PyTorch's documentation example ([source](https://pytorch.org/docs/stable/generated/torch.nn.Dropout.html)):

```python
import torch
import torch.nn as nn

m = nn.Dropout(p=0.5)
input = torch.ones(5, 5)
print(input)
```

```
tensor([[1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1.],
        [1., 1., 1., 1., 1.]])
```

Then I pass it through a dropout layer:

```python
output = m(input)
print(output)
```

```
tensor([[0., 0., 2., 2., 0.],
        [2., 0., 2., 0., 0.],
        [0., 0., 0., 0., 2.],
        [2., 2., 2.,
```
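The behavior in the output above is consistent with "inverted dropout", the scheme `nn.Dropout` documents: during training, each element is zeroed with probability `p` and the surviving elements are multiplied by `1/(1-p)`, so the expected value of each activation is unchanged and no rescaling is needed at eval time. A minimal pure-Python sketch of that idea (the function `inverted_dropout` is an illustrative helper, not part of the PyTorch API):

```python
import random

def inverted_dropout(values, p=0.5, training=True, rng=random.random):
    """Sketch of inverted dropout: zero each value with probability p,
    and scale the survivors by 1/(1-p) so the expected value is preserved."""
    if not training:
        return list(values)  # eval mode is the identity
    scale = 1.0 / (1.0 - p)
    return [0.0 if rng() < p else v * scale for v in values]

# With p=0.5, every surviving 1.0 becomes 1.0 * 1/(1-0.5) = 2.0,
# matching the 0s and 2s in the tensor printed above.
out = inverted_dropout([1.0] * 10, p=0.5)
print(out)
```

In eval mode (`model.eval()` in PyTorch), dropout is disabled and the input passes through unchanged, which is why the scaling only shows up during training.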