This should be a quick one. When I use a pre-defined module in PyTorch, I can typically access its weights fairly easily. However, how do I access them if I wrap the module inside a custom `nn.Module` class?
You can access modules by name using the `_modules` dict:
```python
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(3, 3, 3)

    def forward(self, input):
        return self.conv1(input)

model = Net()
print(model._modules['conv1'])
```
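For the original question about the weights: once you can reach the submodule, its parameters are one attribute away. A minimal sketch (same `Net` as above) showing three equivalent ways to get at them; all three return the very same tensor:

```python
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 3, 3)

    def forward(self, input):
        return self.conv1(input)

model = Net()

# 1. via the _modules dict, as in the answer above
w1 = model._modules['conv1'].weight
# 2. via the attribute directly -- nn.Module resolves submodule attributes
w2 = model.conv1.weight
# 3. via named_parameters(), which walks the whole module tree
w3 = dict(model.named_parameters())['conv1.weight']

print(w1.shape)  # torch.Size([3, 3, 3, 3]): (out_channels, in_channels, kH, kW)
```

Note that `named_parameters()` is often the most convenient for deeply nested wrappers, since the dotted names (`'conv1.weight'`) encode the full path through the hierarchy.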