How to add regularizations in TensorFlow?

慢半拍i 2020-12-07 06:55

I found that in many neural network implementations available in TensorFlow, regularization terms are often implemented by manually adding an additional term to the loss value.
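
For context, the "manual" pattern referred to here is usually loss = data_loss + lambda * (sum of squared weights). A minimal NumPy sketch of that arithmetic (the weights, data loss, and lambda below are purely illustrative, not from any particular codebase):

```python
import numpy as np

# Hypothetical weight matrix and data loss, just to illustrate the arithmetic
W = np.array([[0.5, -1.0], [2.0, 0.0]])
data_loss = 1.0
lam = 0.01  # regularization strength

# The manually added L2 term: lambda * sum of squared weights
reg_term = lam * np.sum(W ** 2)
loss = data_loss + reg_term
print(loss)  # 1.0 + 0.01 * (0.25 + 1.0 + 4.0) = 1.0525
```

This is the pattern the answers below automate via TensorFlow's regularizer machinery.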

10 Answers
  •  长情又很酷
    2020-12-07 07:47

    A few aspects of the existing answer were not immediately clear to me, so here is a step-by-step guide:

    1. Define a regularizer. This is where the regularization constant can be set, e.g.:

      regularizer = tf.contrib.layers.l2_regularizer(scale=0.1)
      
    2. Create variables via:

          weights = tf.get_variable(
              name="weights",
              regularizer=regularizer,
              ...
          )
      

      Equivalently, variables can be created via the regular weights = tf.Variable(...) constructor, followed by tf.add_to_collection(tf.GraphKeys.REGULARIZATION_LOSSES, weights).

    3. Define some loss term and add the regularization term:

      reg_variables = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
      reg_term = tf.contrib.layers.apply_regularization(regularizer, reg_variables)
      loss += reg_term
      

      Note: It looks like tf.contrib.layers.apply_regularization is implemented as an AddN, so more or less equivalent to sum(reg_variables).
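
      To make that note concrete, here is a framework-free sketch (plain NumPy rather than TensorFlow, so the variable names are only stand-ins) of what apply_regularization with l2_regularizer(scale=0.1) computes: per variable, scale * l2_loss(w), where tf.nn.l2_loss is 0.5 * sum(w ** 2), then summed over the collection:

      ```python
      import numpy as np

      def l2_penalty(w, scale=0.1):
          # Mirrors tf.contrib.layers.l2_regularizer(scale):
          # scale * tf.nn.l2_loss(w) == scale * 0.5 * sum(w ** 2)
          return scale * 0.5 * np.sum(w ** 2)

      # Two toy weight tensors standing in for the
      # REGULARIZATION_LOSSES collection
      reg_variables = [np.array([1.0, 2.0]), np.array([3.0])]

      # apply_regularization reduces the per-variable penalties with AddN,
      # i.e. it is a plain sum over the collection
      reg_term = sum(l2_penalty(w) for w in reg_variables)

      data_loss = 0.5  # some hypothetical data loss
      loss = data_loss + reg_term
      print(reg_term)  # 0.1 * 0.5 * (1 + 4) + 0.1 * 0.5 * 9 = 0.7
      ```

      The same sum-over-collection structure is why the AddN note above holds: swapping apply_regularization for a Python-level sum of the individual penalty tensors produces the same value.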
