PyTorch problem notes

Tensor & Variable

What is the difference between Tensors and Variables in Pytorch?

  • torch tensors are actually the data.
  • Variables wrap tensors and record the chain of operations between them, so that gradients can flow back.
  • PyTorch requires the input to forward propagation to be wrapped in a Variable (see the sketch below).
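
A minimal sketch of this old Variable API (note that since PyTorch 0.4, Variable has been merged into Tensor and you can set requires_grad on tensors directly):

import torch
from torch.autograd import Variable

x = Variable(torch.ones(2, 2), requires_grad=True)  # wrap the data so gradients can flow
y = (x * 2).sum()
y.backward()
print(x.grad)   # a 2x2 tensor filled with 2s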

GPU

check

torch.cuda.is_available()
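
The usual pattern built on this check, as a sketch (model and data stand in for your own objects):

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = model.to(device)   # parameters move to the GPU when one is present
data = data.to(device)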

module

model = torch.nn.DataParallel(model).cuda()
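
DataParallel replicates the module and scatters each input batch along dimension 0 across the visible GPUs. A small self-contained sketch:

import torch
import torch.nn as nn

model = nn.Linear(10, 2)              # any module works here
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)    # split each batch across the GPUs
model = model.cuda()
out = model(torch.randn(32, 10).cuda())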

Dataloader

dataloader = torch.utils.data.DataLoader(dataset, pin_memory=True)   # pin_memory belongs in the constructor
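
Pinned (page-locked) host memory makes host-to-GPU copies faster and lets them run asynchronously. A hedged sketch of the matching transfer code:

for data, label in dataloader:
    # non_blocking only pays off when the source tensor lives in pinned memory
    data = data.cuda(non_blocking=True)
    label = label.cuda(non_blocking=True)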
Tensor conversion

Getting a value out of a torch.cuda.FloatTensor

  • cuda -> cpu
    • a.data.cpu().numpy()
    • First get the tensor out of the Variable using .data.
    • Then move the tensor to the CPU using .cpu().
    • After that, convert the tensor to numpy using .numpy(), as in the snippet after this list.
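
Putting the three steps together (assuming a CUDA device is available; on PyTorch >= 0.4 you would write v.detach().cpu().numpy() instead):

import torch
from torch.autograd import Variable

v = Variable(torch.cuda.FloatTensor([1.0, 2.0]))
arr = v.data.cpu().numpy()   # Variable -> Tensor -> CPU tensor -> numpy array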

Creating tensors on GPU directly

  • torch.cuda.FloatTensor(1000, 1000).fill_(0)
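
Allocating directly on the GPU avoids creating a CPU tensor first and then copying it over. On PyTorch >= 0.4 the same thing is spelled with a device argument:

a = torch.cuda.FloatTensor(1000, 1000).fill_(0)   # legacy style, as above
b = torch.zeros(1000, 1000, device='cuda')        # modern equivalent
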
tricks

REF: Zhihu
1. torch.backends.cudnn.benchmark = True — adding this line at the very start of the program gives a small training speedup at essentially no extra cost. I add it almost every time.
2. Training can get slower and slower, roughly linearly, perhaps because small temporary variables are introduced at every iteration. The developers do not yet know the cause, but calling torch.cuda.empty_cache() periodically works around it; the call releases cached GPU memory that is no longer in use.
3. Be very careful when computing with Variable data. Unless it is really necessary, do the computation with Tensors instead, as the loop below shows.

for data, label in trainloader:
    ......
    out = model(data)
    loss = criterion(out, label)
    # loss_sum += loss          # <--- wrong here!!!
    loss_sum += loss.data[0]
    ......

This is because the loss coming out of the model is a Variable, and computations on Variables record the operations that produce each new Variable so they can be used when differentiating during backpropagation; accumulating the raw loss therefore keeps the whole computation history alive and memory usage grows.
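
A sketch that combines tips 1 and 2 in one loop (trainloader and the flush interval are placeholders, not part of the original notes):

import torch

torch.backends.cudnn.benchmark = True            # tip 1: set once at startup

for step, (data, label) in enumerate(trainloader):
    ...                                          # forward / backward / update
    if step % 1000 == 0:                         # tip 2: interval chosen arbitrarily
        torch.cuda.empty_cache()                 # release cached GPU memory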

nn.Module.train()

train() and eval() matter for batchnorm layers and dropout layers: they switch those layers between training behavior and inference behavior (see the sketch below).
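
A minimal sketch of the usual switch between the two modes (model and data are placeholders):

model.train()            # dropout active; batchnorm uses batch statistics
# ... training loop ...

model.eval()             # dropout off; batchnorm uses running statistics
with torch.no_grad():    # also skip gradient bookkeeping at inference time
    out = model(data)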

Sequential()

  • A sequential container. Modules will be added to it in the order they are passed in the constructor. Alternatively, an ordered dict of modules can also be passed in.
# Example of using Sequential
model = nn.Sequential(
    nn.Conv2d(1, 20, 5),
    nn.ReLU(),
    nn.Conv2d(20, 64, 5),
    nn.ReLU()
)
>>> l = nn.Linear(2, 2)
>>> net = nn.Sequential(l, l)
>>> for idx, m in enumerate(net.modules()):
...     print(idx, '->', m)

0 -> Sequential (
  (0): Linear (2 -> 2)
  (1): Linear (2 -> 2)
)
1 -> Linear (2 -> 2)
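
Note that net.modules() reports each module only once, which is why the shared Linear shows up a single time above. The ordered-dict variant mentioned in the bullet, as a sketch (the key names are arbitrary):

from collections import OrderedDict
import torch.nn as nn

model = nn.Sequential(OrderedDict([
    ('conv1', nn.Conv2d(1, 20, 5)),
    ('relu1', nn.ReLU()),
    ('conv2', nn.Conv2d(20, 64, 5)),
    ('relu2', nn.ReLU()),
]))
print(model.conv1)   # submodules are reachable by name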