Question
How to count the total number of parameters in a PyTorch model? Something similar to model.count_params() in Keras.
Answer 1:
PyTorch doesn't have a built-in function to count the total number of parameters the way Keras does, but it's possible to sum the number of elements of every parameter tensor:
pytorch_total_params = sum(p.numel() for p in model.parameters())
If you want to calculate only the trainable parameters:
pytorch_total_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
Answer inspired by this answer on PyTorch Forums.
Note: I'm answering my own question. If anyone has a better solution, please share with us.
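For example, applied to a small throwaway model (a minimal sketch; the layer sizes below are arbitrary and not part of the original question):

import torch.nn as nn

# hypothetical toy model just to demonstrate the counting idiom
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 2),
)

total_params = sum(p.numel() for p in model.parameters())
trainable_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(total_params)      # 418 = (10*32 + 32) + (32*2 + 2)
print(trainable_params)  # also 418, since no parameters are frozen here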
Answer 2:
If you want to calculate the number of weights and biases in each layer without instantiating the model, you can simply load the saved state-dict file and iterate over the resulting collections.OrderedDict like so:
import torch

tensor_dict = torch.load('model.dat', map_location='cpu')  # OrderedDict
tensor_list = list(tensor_dict.items())
for layer_tensor_name, tensor in tensor_list:
    print('{}: {}'.format(layer_tensor_name, torch.numel(tensor)))
You'll get something like
conv1.weight: 312
conv1.bias: 26
batch_norm1.weight: 26
batch_norm1.bias: 26
batch_norm1.running_mean: 26
batch_norm1.running_var: 26
conv2.weight: 2340
conv2.bias: 10
batch_norm2.weight: 10
batch_norm2.bias: 10
batch_norm2.running_mean: 10
batch_norm2.running_var: 10
fcs.layers.0.weight: 135200
fcs.layers.0.bias: 260
fcs.layers.1.weight: 33800
fcs.layers.1.bias: 130
fcs.batch_norm_layers.0.weight: 260
fcs.batch_norm_layers.0.bias: 260
fcs.batch_norm_layers.0.running_mean: 260
fcs.batch_norm_layers.0.running_var: 260
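Note that a state dict also contains buffers such as running_mean and running_var, which are not returned by model.parameters(), so the per-entry counts above can sum to more than the parameter count from the first answer. If you just want the grand total, a minimal sketch (assuming the same hypothetical 'model.dat' file) would be:

import torch

tensor_dict = torch.load('model.dat', map_location='cpu')  # OrderedDict of tensors
total_elements = sum(torch.numel(t) for t in tensor_dict.values())
print('Total elements in state dict: {}'.format(total_elements))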
Answer 3:
Another possible solution is to write a small summary function by hand:
def model_summary(model):
    print("model_summary")
    print()
    print("Layer_name" + "\t" * 7 + "Number of Parameters")
    print("=" * 100)
    model_parameters = [layer for layer in model.parameters() if layer.requires_grad]
    layer_names = [child for child in model.children()]
    j = 0
    total_params = 0
    print("\t" * 10)
    for i in layer_names:
        print()
        param = 0
        try:
            bias = (i.bias is not None)
        except AttributeError:
            bias = False
        if bias:
            # layer has both a weight and a bias tensor
            param = model_parameters[j].numel() + model_parameters[j + 1].numel()
            j = j + 2
        else:
            # layer has a weight tensor only
            param = model_parameters[j].numel()
            j = j + 1
        print(str(i) + "\t" * 3 + str(param))
        total_params += param
    print("=" * 100)
    print(f"Total Params:{total_params}")
model_summary(net)
This would give output similar to the following:
model_summary
Layer_name Number of Parameters
====================================================================================================
Conv2d(1, 6, kernel_size=(3, 3), stride=(1, 1)) 60
Conv2d(6, 16, kernel_size=(3, 3), stride=(1, 1)) 880
Linear(in_features=576, out_features=120, bias=True) 69240
Linear(in_features=120, out_features=84, bias=True) 10164
Linear(in_features=84, out_features=10, bias=True) 850
====================================================================================================
Total Params:81194
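The table above is consistent with a small LeNet-style network; the original answer does not show the definition of net, but a hypothetical model that reproduces those numbers could look like this:

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    # hypothetical LeNet-style model; layer sizes chosen to match the summary above
    # (assumes 32x32 single-channel input)
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 6, kernel_size=3)
        self.conv2 = nn.Conv2d(6, 16, kernel_size=3)
        self.fc1 = nn.Linear(576, 120)  # 576 = 16 * 6 * 6 after two conv + 2x2 max-pool stages
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)
        x = x.flatten(1)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)

net = Net()
model_summary(net)  # should end with Total Params:81194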
Source: https://stackoverflow.com/questions/49201236/check-the-total-number-of-parameters-in-a-pytorch-model