PyTorch preferred way to copy a tensor

予麋鹿 2020-12-23 18:40

There seem to be several ways to create a copy of a tensor in PyTorch, including

y = tensor.new_tensor(x) #a

y = x.clone().detach() #b

y = torch.empty_like(x).copy_(x) #c


        
4 Answers
  •  心在旅途
    2020-12-23 19:35

    Here is one way to check whether a tensor has actually been copied:

    import torch
    def samestorage(x,y):
        if x.storage().data_ptr()==y.storage().data_ptr():
            print("same storage")
        else:
            print("different storage")
    a = torch.ones((1,2), requires_grad=True)
    print(a)
    b = a
    c = a.data
    d = a.detach()
    e = a.data.clone()
    f = a.clone()
    g = a.detach().clone()
    i = torch.empty_like(a).copy_(a)
    j = torch.tensor(a) # UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
    
    
    print("a:",end='');samestorage(a,a)
    print("b:",end='');samestorage(a,b)
    print("c:",end='');samestorage(a,c)
    print("d:",end='');samestorage(a,d)
    print("e:",end='');samestorage(a,e)
    print("f:",end='');samestorage(a,f)
    print("g:",end='');samestorage(a,g)
    print("i:",end='');samestorage(a,i)
    print("j:",end='');samestorage(a,j)
    
    

    Out:

    tensor([[1., 1.]], requires_grad=True)
    a:same storage
    b:same storage
    c:same storage
    d:same storage
    e:different storage
    f:different storage
    g:different storage
    i:different storage
    j:different storage
    

    A tensor has actually been copied whenever "different storage" is printed. PyTorch has close to 100 different tensor constructors, so there are many more ways to do this.
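    Why the storage check matters: two tensors that share storage are aliases, so an in-place write through one is visible through the other, while a real copy is independent. A minimal sketch (using a tensor without `requires_grad` so the in-place update is allowed):

    ```python
    import torch

    a = torch.ones(1, 2)
    alias = a.detach()   # same storage as a -- no copy
    copy = a.clone()     # different storage -- real copy

    a.add_(1)            # in-place update of a

    print(alias)         # tensor([[2., 2.]]) -- sees the change
    print(copy)          # tensor([[1., 1.]]) -- unaffected
    ```
    
    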

    If I need to copy a tensor I just use clone(), which also copies the autograd (AD) related info. If I need the copy without the AD info, I use:

    y = x.clone().detach()
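    To illustrate the difference (a small sketch, not part of the original answer): `clone()` produces a copy that stays connected to the autograd graph, so gradients flow back to the source, while `clone().detach()` cuts that connection.

    ```python
    import torch

    x = torch.ones(2, requires_grad=True)
    y1 = x.clone()            # copy that stays in the autograd graph
    y2 = x.clone().detach()   # copy with the AD info removed

    print(y1.requires_grad)   # True  -- backward() through y1 reaches x
    print(y2.requires_grad)   # False -- detached from the graph

    y1.sum().backward()
    print(x.grad)             # tensor([1., 1.])
    ```
    
    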
    
