Is it a problem if layers with ReLU activation produce outputs of large magnitude, e.g. between 0 and 25?

Asked by 栀梦 on 2020-12-17 04:20

I am training a deep neural network that consists of 7 layers (4 x conv2d and 3 fully connected). All layers use ReLU as the activation function. Now I see that the output values of some layers range roughly between 0 and 25 rather than staying in a small range such as 0 to 1. Is that a problem?
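One way to check this directly is to log the output range of every ReLU during a forward pass. Below is a minimal sketch, assuming PyTorch (the question does not name a framework); the network mirrors the described 4-conv / 3-fully-connected ReLU architecture, but all channel and width numbers are hypothetical placeholders, not the asker's actual values.

```python
# Minimal sketch (PyTorch assumed): log each ReLU layer's output range with
# forward hooks. Layer sizes are placeholders, not the asker's configuration.
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # 4 conv layers + 3 fully connected layers, all with ReLU, as described.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Sequential(
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 10),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

net = Net()

def log_range(name):
    # Forward hook: print the min/max of this module's output each pass.
    def hook(module, inputs, output):
        print(f"{name}: min={output.min():.2f} max={output.max():.2f}")
    return hook

# Attach a hook to every ReLU so each forward pass reports its output range.
for name, module in net.named_modules():
    if isinstance(module, nn.ReLU):
        module.register_forward_hook(log_range(name))

_ = net(torch.randn(8, 3, 32, 32))  # dummy batch; prints each ReLU's range
```

Running this on real batches during training would show whether the 0-25 range is stable or growing layer by layer, which is the more useful diagnostic than the absolute value itself.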
