A module's parameters get changed during training; that is, they are what is learnt during the training of a neural network. But what is a buffer, and is it learnt during training?
Both parameters and buffers are tensors that you create and register on a module (nn.Module).
Say you have a linear layer nn.Linear; it already has weight and bias parameters. But if you need an additional parameter, you use register_parameter() to register a new named parameter, which is a tensor.
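For instance, a minimal sketch (the module and the extra "scale" parameter are made-up names, just for illustration):

```python
import torch
import torch.nn as nn

class ScaledLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # Register an extra learnable parameter named "scale" (hypothetical name).
        self.register_parameter("scale", nn.Parameter(torch.ones(out_features)))

    def forward(self, x):
        return self.linear(x) * self.scale

m = ScaledLinear(4, 2)
print([name for name, _ in m.named_parameters()])
# includes 'scale' alongside 'linear.weight' and 'linear.bias'
```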
When you register a new parameter, it appears in the module.parameters() iterator; a registered buffer does not (it shows up in module.buffers() instead).
The difference:
Buffers are named tensors that, unlike parameters, are not updated by gradients at every step. If a buffer needs to change, you write your own update logic for it (that part is fully up to you).
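As a sketch, here is a module that keeps a running mean of its inputs in a buffer (the name running_mean and the update rule are just for illustration):

```python
import torch
import torch.nn as nn

class RunningMean(nn.Module):
    def __init__(self, num_features, momentum=0.1):
        super().__init__()
        self.momentum = momentum
        # A buffer: part of the module's state, but not a parameter.
        self.register_buffer("running_mean", torch.zeros(num_features))

    def forward(self, x):
        # Custom update logic -- no gradients involved.
        with torch.no_grad():
            self.running_mean.mul_(1 - self.momentum).add_(self.momentum * x.mean(dim=0))
        return x - self.running_mean

m = RunningMean(3)
print(list(m.named_parameters()))  # [] -- the buffer is not a parameter
print(list(m.named_buffers()))     # [('running_mean', tensor([0., 0., 0.]))]
```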
The good thing is that both parameters and buffers are included in the module's state_dict, so they are saved along with the model, and both move with the model when you move it on or off a CUDA device (e.g. with .to(device) or .cuda()).
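For example, nn.BatchNorm1d has both parameters (weight, bias) and buffers (the running statistics), and all of them travel together:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(3)

# state_dict contains both parameters and buffers, typically:
# ['weight', 'bias', 'running_mean', 'running_var', 'num_batches_tracked']
print(list(bn.state_dict().keys()))

# Both also follow the module across devices.
if torch.cuda.is_available():
    bn = bn.to("cuda")
    print(bn.weight.device, bn.running_mean.device)  # cuda:0 cuda:0
```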