TensorRT engine file can't run inference with tritonserver but can be used with Python

傲寒 2021-01-01 05:15

My Env:

  • CentOS 7
  • NVIDIA driver 440.36
  • CUDA 10.2
  • cuDNN 8
  • TensorRT 7.1.3.4
  • nvcr.io/nvidia/tritonserver 20
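
Since the post does not include any code, here is a minimal sketch of deserializing the engine with the TensorRT 7 Python API; the file name model.plan is an assumption. This kind of check shows whether the locally installed TensorRT matches the version that built the engine, which also needs to match the TensorRT inside the tritonserver container:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# The TensorRT version used to deserialize must match the version that built
# the engine (7.1.3.4 here); the Triton container must ship the same version.
print("Local TensorRT version:", trt.__version__)

# "model.plan" is an assumed file name for the serialized engine.
with open("model.plan", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

if engine is None:
    print("Deserialization failed (typically a TensorRT version or GPU mismatch).")
else:
    # List binding names and shapes to confirm the engine loaded correctly.
    for i in range(engine.num_bindings):
        print(engine.get_binding_name(i), engine.get_binding_shape(i))
```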