Bfloat16 training on GPUs

Submitted by 独自空忆成欢 on 2019-12-11 05:18:37

Question


Hi, I am trying to train a model using variables in the new bfloat16 datatype. I know this is supported on Google TPUs. I was wondering whether anyone has tried training on GPUs (for example, a GTX 1080 Ti). Is that even possible, i.e., do GPU tensor cores support bfloat16? If anyone has any experience, please share your thoughts. Many thanks!


Answer 1:


I posted this question to the TensorFlow GitHub community. Here is their response so far: "bfloat16 support isn't complete for GPUs, as it's not supported natively by the devices.

For performance you'll want to use float32 or float16 for GPU execution (though float16 can be difficult to train models with). TPUs support bfloat16 for effectively all operations (but you currently have to migrate your model to work on the TPU)."
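As a practical way to see this for yourself, here is a minimal probe sketch (it assumes the TensorFlow 2.x API; the tensor names `a`, `b`, `c` are illustrative). It builds bfloat16 tensors and attempts a matmul pinned to the GPU; depending on your TensorFlow version and hardware, this may run via a slow non-native fallback kernel or raise a "no kernel registered" error, which is what "support isn't complete" means in practice.

```python
import tensorflow as tf

# Show which GPUs TensorFlow can see.
print(tf.config.list_physical_devices('GPU'))

# Small bfloat16 operands for the test.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]], dtype=tf.bfloat16)
b = tf.constant([[5.0, 6.0], [7.0, 8.0]], dtype=tf.bfloat16)

try:
    # Force placement on the first GPU to avoid a silent CPU fallback.
    with tf.device('/GPU:0'):
        c = tf.matmul(a, b)
    print('bfloat16 matmul ran on GPU:', c.dtype)
except Exception as e:
    # Typical failure mode: no bfloat16 kernel registered for this op/device.
    print('bfloat16 matmul failed on GPU:', e)
```

For the float16 route the answer suggests, recent TensorFlow releases offer mixed precision, e.g. `tf.keras.mixed_precision.set_global_policy('mixed_float16')`, which keeps variables in float32 and uses loss scaling to work around the float16 training difficulties mentioned above.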



Source: https://stackoverflow.com/questions/51602313/bfloat16-training-in-gpus
