I'm trying to train a model in PyTorch, and I'd like to use a batch size of 8, but due to memory limitations, I can only fit a batch size of at most 4. I've looked all