How to check if pytorch is using the GPU?

既然无缘 2020-12-04 05:01

I would like to know if PyTorch is using my GPU. It's possible to detect with nvidia-smi whether there is any activity from the GPU during the process, but I want something I can check from within a Python script.

10 answers
  •  野趣味 (OP)
     2020-12-04 05:23

    To check if there is a GPU available:

    import torch
    torch.cuda.is_available()  # returns True if a CUDA-capable GPU is visible to PyTorch
    

    If the above function returns False, then either:

    1. you have no GPU,
    2. the Nvidia drivers have not been installed, so the OS does not see the GPU,
    3. or the GPU is hidden by the environment variable CUDA_VISIBLE_DEVICES. When CUDA_VISIBLE_DEVICES is set to -1, all of your devices are hidden; you can check its value in code with os.environ['CUDA_VISIBLE_DEVICES'] (see the sketch after this list).
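
    As a minimal diagnostic sketch (assuming PyTorch is installed; the printed values are only illustrative), these checks can be combined in one place:

    import os
    import torch

    # CUDA_VISIBLE_DEVICES is unset unless you (or a job scheduler) exported it
    print('CUDA_VISIBLE_DEVICES:', os.environ.get('CUDA_VISIBLE_DEVICES'))
    print('CUDA available:', torch.cuda.is_available())
    if torch.cuda.is_available():
        print('Device count:', torch.cuda.device_count())
        print('Current device:', torch.cuda.current_device())
        print('Device name:', torch.cuda.get_device_name(0))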

    If the above function returns True, that does not necessarily mean you are using the GPU. In PyTorch, tensors are allocated to a device when they are created, and by default they are allocated to the CPU. To check where your tensor is allocated, do:

    import torch
    a = torch.tensor([1.0, 2.0])  # stand-in for a tensor created somewhere else
    a.device  # the device where the tensor is allocated, e.g. device(type='cpu')
    

    Note that you cannot operate on tensors allocated on different devices. To see how to allocate a tensor to the GPU, see: https://pytorch.org/docs/stable/notes/cuda.html
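
    As a short sketch of what that looks like (assuming a CUDA build of PyTorch; the tensor shapes are arbitrary):

    import torch

    # Pick the GPU if one is available, otherwise fall back to the CPU
    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

    a = torch.randn(3, 3, device=device)  # created directly on the chosen device
    b = torch.randn(3, 3).to(device)      # or moved there after creation
    c = a @ b                             # both operands live on the same device
    print(c.device)                       # prints cuda:0 when a GPU is in use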
