ValueError: optimizer got an empty parameter list

Submitted by 强颜欢笑 on 2020-11-29 19:03:44

Question


I create the following simple linear class:

import torch.nn as nn
import torch.nn.functional as F

class Decoder(nn.Module):
    def __init__(self, K, h=()):
        super().__init__()
        h = (K,)+h+(K,)
        self.layers = [nn.Linear(h1,h2) for h1,h2 in zip(h, h[1:])]

    def forward(self, x):
        for layer in self.layers[:-1]:
            x = F.relu(layer(x))
        return self.layers[-1](x)

However, when I try to pass the parameters to an optimizer, I get the error ValueError: optimizer got an empty parameter list.

from torch import optim

decoder = Decoder(4)
LR = 1e-3
opt = optim.Adam(decoder.parameters(), lr=LR)

Is there something I'm doing obviously wrong with the class definition?


Answer 1:


Since you store your layers in a regular Python list inside your Decoder, PyTorch has no way of telling that the members of this list are actually sub-modules. Convert the list into PyTorch's nn.ModuleList and your problem will be solved:

class Decoder(nn.Module):
    def __init__(self, K, h=()):
        super().__init__()
        h = (K,)+h+(K,)
        self.layers = nn.ModuleList(nn.Linear(h1,h2) for h1,h2 in zip(h, h[1:]))
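To see why the container type matters, here is a minimal pure-Python sketch of the registration mechanism (this is an illustration, not PyTorch's actual implementation): a module intercepts attribute assignment to record module-valued attributes as children, so anything hidden inside a plain list is never registered and never contributes parameters.

```python
# Minimal sketch (NOT PyTorch itself) of why attribute assignment matters:
# the module base class intercepts __setattr__ to register sub-modules,
# so values buried in a plain Python list are never seen.

class MiniModule:
    def __init__(self):
        # Bypass our own __setattr__ while bootstrapping the child registry.
        object.__setattr__(self, "_children", {})

    def __setattr__(self, name, value):
        # Register the value as a child only if it is itself module-like.
        if isinstance(value, MiniModule):
            self._children[name] = value
        object.__setattr__(self, name, value)

    def parameters(self):
        # Recursively collect parameters from registered children only.
        for child in self._children.values():
            yield from child.parameters()

class MiniLinear(MiniModule):
    def __init__(self):
        super().__init__()
        self.weight = [0.0]  # stand-in for a real parameter tensor

    def parameters(self):
        yield self.weight

class MiniModuleList(MiniModule):
    def __init__(self, modules):
        super().__init__()
        for i, m in enumerate(modules):
            setattr(self, str(i), m)  # each item becomes a registered child

class PlainListHolder(MiniModule):
    def __init__(self):
        super().__init__()
        # A plain list: __setattr__ sees a list, not a module -> not registered.
        self.layers = [MiniLinear(), MiniLinear()]

class ModuleListHolder(MiniModule):
    def __init__(self):
        super().__init__()
        # A module-list container: registered, and it registers its items.
        self.layers = MiniModuleList([MiniLinear(), MiniLinear()])

print(len(list(PlainListHolder().parameters())))   # 0 -> "empty parameter list"
print(len(list(ModuleListHolder().parameters())))  # 2 -> optimizer sees them
```

PyTorch's real nn.Module works along the same lines: nn.ModuleList participates in this registration, a plain list does not, which is exactly why the optimizer received an empty parameter list.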


Source: https://stackoverflow.com/questions/57320958/valueerror-optimizer-got-an-empty-parameter-list
