How to check if pytorch is using the GPU?

既然无缘 2020-12-04 05:01

I would like to know if PyTorch is using my GPU. It's possible to detect with nvidia-smi whether there is any activity from the GPU during the process, but I want to check it from within a Python script.

10 Answers
  •  温柔的废话
    2020-12-04 05:12

    After you start running the training loop, if you want to manually watch from the terminal whether your program is utilizing the GPU, and to what extent, you can simply use watch:

    $ watch -n 2 nvidia-smi
    

    This will refresh the usage stats every 2 seconds until you press Ctrl+C.


    If you need more control over which GPU stats are shown, you can use a more sophisticated version of nvidia-smi with --query-gpu=.... Below is a simple illustration of this:

    $ watch -n 3 nvidia-smi --query-gpu=index,gpu_name,memory.total,memory.used,memory.free,temperature.gpu,pstate,utilization.gpu,utilization.memory --format=csv
    

    which outputs the requested fields in CSV format, one row per GPU.

    Note: There must not be any space between the comma-separated query names in --query-gpu=...; otherwise those values are ignored and no stats are returned.


    Also, you can check whether your installation of PyTorch detects your CUDA installation correctly by doing:

    In [13]: import torch
    
    In [14]: torch.cuda.is_available()
    Out[14]: True
    

    A True result means that PyTorch is configured correctly and can use the GPU, although you still have to move/place the tensors onto the device with the necessary statements in your code.
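
    To illustrate, here is a minimal sketch of the usual device-selection pattern: pick cuda when available, fall back to cpu, and move tensors explicitly with .to(device). The variable names are just examples.

    ```python
    import torch

    # Select the GPU if PyTorch can see one, otherwise fall back to CPU
    use_cuda = torch.cuda.is_available()
    device = torch.device("cuda" if use_cuda else "cpu")
    print("Using device:", device)

    if use_cuda:
        # Extra details, only meaningful when a CUDA device is present
        print("GPU name:", torch.cuda.get_device_name(0))
        print("Memory allocated:", torch.cuda.memory_allocated(0), "bytes")

    # Tensors are created on the CPU by default; they must be moved explicitly
    x = torch.randn(3, 3).to(device)
    y = x * 2  # this multiply runs on the GPU if x lives there
    print("Tensor lives on:", y.device)
    ```

    The same .to(device) call works for models (nn.Module), so one device variable at the top of a script keeps the code runnable on both GPU and CPU machines.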


    If you want to do this inside Python code, then look into this module:

    https://github.com/jonsafari/nvidia-ml-py or in pypi here: https://pypi.python.org/pypi/nvidia-ml-py/
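
    As a sketch of what those bindings look like in practice (assuming the nvidia-ml-py package is installed via pip, which exposes the pynvml module; the gpu_stats helper name is my own), querying utilization and memory for GPU 0 goes roughly like this:

    ```python
    import pynvml  # provided by the nvidia-ml-py package: pip install nvidia-ml-py

    def gpu_stats():
        """Return (utilization %, memory used in MiB) for GPU 0,
        or None if no NVIDIA driver/GPU is present."""
        try:
            pynvml.nvmlInit()
        except pynvml.NVMLError:
            return None  # no driver or no GPU on this machine
        try:
            handle = pynvml.nvmlDeviceGetHandleByIndex(0)
            util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
            mem_used = pynvml.nvmlDeviceGetMemoryInfo(handle).used / 1024**2
            return util, mem_used
        finally:
            pynvml.nvmlShutdown()

    print(gpu_stats())
    ```

    This reads the same counters nvidia-smi reports, so it is handy for logging GPU load from inside the training script itself.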
