I am using PyTorch with a pretrained model from the transformers library. However, while fine-tuning, it runs out of GPU memory very quickly, and I wonder why.
I've foun
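
The post does not include the training code, so as context here is a minimal sketch of the kind of fine-tuning loop being described. The checkpoint name, batch contents, and hyperparameters below are placeholders, not taken from the original setup.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

device = "cuda" if torch.cuda.is_available() else "cpu"

# Hypothetical checkpoint; the question does not say which model is used.
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# Placeholder batch standing in for a real dataset.
texts = ["example sentence"] * 8
labels = torch.zeros(8, dtype=torch.long).to(device)
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt").to(device)

model.train()
for step in range(10):
    optimizer.zero_grad()
    outputs = model(**batch, labels=labels)
    loss = outputs.loss
    loss.backward()
    optimizer.step()
    # Store only the Python float, not the graph-attached tensor,
    # so activation memory is not retained across steps.
    running_loss = loss.item()
```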