Why is batch size allocated in GPU?

Backend · Unanswered · 0 replies · 1342 views
天命终不由人 2021-01-05 21:55

Given a Keras model (on Colab) with input shape (None, 256, 256, 3) and a batch_size of 16, the memory allocated for that input is 16 * 256 * 256 * 3 * datatype_size bytes (datatype size = 2 bytes, e.g. for float16). Why is the batch dimension part of the GPU allocation?
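The arithmetic in the question can be sketched as a quick back-of-the-envelope calculation. This is a minimal sketch, assuming a 2-byte element type (float16); the variable names are illustrative, not part of any Keras API:

```python
import numpy as np

# Rough estimate of GPU memory for a single input batch.
# Assumption: each element is float16, i.e. 2 bytes.
batch_size = 16
input_shape = (256, 256, 3)           # H, W, C from the model's (None, 256, 256, 3)
bytes_per_element = np.dtype(np.float16).itemsize  # 2

num_elements = batch_size * int(np.prod(input_shape))
memory_bytes = num_elements * bytes_per_element

print(memory_bytes)           # 6291456 bytes
print(memory_bytes / 2**20)   # 6.0 MiB
```

Note this only covers the input tensor itself; the framework also allocates memory for activations, gradients, and workspace buffers, which usually dominate.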
