How to add regularizations in TensorFlow?

慢半拍i 2020-12-07 06:55

I have found that in much of the publicly available neural network code implemented with TensorFlow, regularization terms are added by manually appending an additional term to the loss value. Is there a more elegant or built-in way to add regularization in TensorFlow?
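
For context, here is a minimal sketch of the manual pattern described above (the variable shapes and the weight-decay factor are hypothetical, not taken from any particular code base):

    import tensorflow as tf

    W1 = tf.get_variable("W1", shape=[784, 256])
    W2 = tf.get_variable("W2", shape=[256, 10])
    data_loss = tf.constant(0.0)  # stands in for the usual data loss, e.g. cross-entropy

    # The regularization term is written out by hand and added to the loss.
    beta = 0.01
    total_loss = data_loss + beta * (tf.nn.l2_loss(W1) + tf.nn.l2_loss(W2))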

10 Answers
  •  庸人自扰
    2020-12-07 07:39

    The entries in tf.GraphKeys.REGULARIZATION_LOSSES are not added to your loss automatically, but there is a simple way to add them:

    # Sums everything currently in the tf.GraphKeys.REGULARIZATION_LOSSES collection
    reg_loss = tf.losses.get_regularization_loss()
    total_loss = loss + reg_loss
    

    tf.losses.get_regularization_loss() uses tf.add_n to sum the entries of tf.GraphKeys.REGULARIZATION_LOSSES. That collection typically holds a list of scalars computed by regularizer functions; entries are added to it by calls to tf.get_variable that have the regularizer parameter specified (a fuller sketch of that route follows the example below). You can also add to the collection manually, which is useful when you create weights with tf.Variable, or when you want activity regularizers or other custom regularizers. For instance:

    # This adds an activity regularizer on y to the REGULARIZATION_LOSSES collection
    x = tf.placeholder(tf.float32, [None, 10])  # some input tensor (hypothetical shape)
    regularizer = tf.contrib.layers.l2_regularizer(0.1)
    y = tf.nn.sigmoid(x)
    act_reg = regularizer(y)
    tf.add_to_collection(tf.GraphKeys.REGULARIZATION_LOSSES, act_reg)
    

    (In this example it would presumably be more effective to regularize x, as y really flattens out for large x.)
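
    For completeness, here is a minimal end-to-end sketch of the tf.get_variable route mentioned above (the shapes, variable names, and regularization scale are hypothetical):

    import tensorflow as tf

    regularizer = tf.contrib.layers.l2_regularizer(scale=0.01)

    # Any variable created with a regularizer argument contributes a scalar
    # penalty to the tf.GraphKeys.REGULARIZATION_LOSSES collection.
    W = tf.get_variable("W", shape=[784, 10], regularizer=regularizer)
    b = tf.get_variable("b", shape=[10], initializer=tf.zeros_initializer())

    x = tf.placeholder(tf.float32, [None, 784])
    labels = tf.placeholder(tf.int64, [None])
    logits = tf.matmul(x, W) + b

    data_loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))

    # Sums everything in the collection, including the penalty on W above.
    reg_loss = tf.losses.get_regularization_loss()
    total_loss = data_loss + reg_loss

    train_op = tf.train.GradientDescentOptimizer(0.1).minimize(total_loss)

    The point of this route is that reg_loss picks up the penalty on W without any manual bookkeeping in the loss expression.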
