Activation function after pooling layer or convolutional layer?

夕颜 2020-12-28 14:59

The theory from these links shows that the order in a convolutional network is: Convolutional Layer - Non-linear Activation - Pooling Layer (a minimal sketch of this ordering follows the links below).

  1. Neural n
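
A minimal sketch of that ordering (PyTorch is an assumption here; the question names no framework):

    import torch.nn as nn

    # Convolutional Layer -> Non-linear Activation -> Pooling Layer
    block = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.MaxPool2d(kernel_size=2),
    )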
2 Answers
  •  自闭症患者
    2020-12-28 15:12

    In many papers, people use conv -> pooling -> non-linearity. That does not mean you can't use the other order and still get reasonable results. In the case of a max-pooling layer and ReLU, the order does not matter (both compute the same thing):

    You can prove this by remembering that ReLU is an element-wise operation and a non-decreasing function, so

        ReLU(MaxPool(x)) = max(0, max(x_1, ..., x_n)) = max(max(0, x_1), ..., max(0, x_n)) = MaxPool(ReLU(x))

    The same holds for almost every activation function (most of them are non-decreasing), but it does not work for a general pooling layer such as average-pooling: for x = (-2, 2), ReLU(AvgPool(x)) = ReLU(0) = 0, while AvgPool(ReLU(x)) = AvgPool(0, 2) = 1.
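
    A minimal numerical check of this (a sketch in NumPy; the maxpool2x2 and avgpool2x2 helpers are hypothetical, written only for illustration):

        import numpy as np

        def relu(x):
            # Element-wise, non-decreasing activation
            return np.maximum(x, 0)

        def maxpool2x2(x):
            # 2x2 max-pooling with stride 2 over an (H, W) array
            h, w = x.shape
            return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

        def avgpool2x2(x):
            # 2x2 average-pooling with stride 2 over an (H, W) array
            h, w = x.shape
            return x.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

        x = np.random.randn(4, 4)

        # Max-pooling and ReLU commute because ReLU is non-decreasing...
        assert np.allclose(relu(maxpool2x2(x)), maxpool2x2(relu(x)))

        # ...but average-pooling and ReLU generally do not.
        print(np.allclose(relu(avgpool2x2(x)), avgpool2x2(relu(x))))  # almost always False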


    Nonetheless, while both orders produce the same result, Activation(MaxPool(x)) computes it significantly faster by performing fewer operations. For a pooling layer of size k, it makes k^2 times fewer calls to the activation function.
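
    A quick back-of-the-envelope check of that k^2 factor (a sketch; the sizes are illustrative): pooling first with a k x k window and stride k shrinks each spatial dimension by k, so the activation touches H*W/k^2 values instead of H*W.

        # Activation-call counts for an H x W feature map with k x k pooling (stride k)
        H, W, k = 32, 32, 2
        calls_activation_first = H * W                 # ReLU(x), then MaxPool
        calls_pooling_first = (H // k) * (W // k)      # MaxPool(x), then ReLU
        print(calls_activation_first // calls_pooling_first)  # 4 == k**2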

    Sadly, this optimization is negligible for a CNN, because the majority of the time is spent in the convolutional layers anyway.
