What is the best way to implement weight constraints in TensorFlow?

Asked by 南方客 on 2020-12-23 04:04

Suppose we have weights

x = tf.Variable(np.random.random((5,10)))
cost = ...

And we use the gradient-descent (GD) optimizer:

upds = tf.train.GradientDescentOptimizer(lr).minimize(cost)

2 Answers
  •  渐次进展
     2020-12-23 04:32

    As of TensorFlow 1.4, there is a new argument to tf.get_variable that allows you to pass a constraint function, which is applied after each optimizer update. Here is an example that enforces a non-negativity constraint:

    with tf.variable_scope("MyScope"):
      v1 = tf.get_variable("v1", …, constraint=lambda x: tf.clip_by_value(x, 0, np.infty))
    

    constraint: An optional projection function to be applied to the variable after being updated by an Optimizer (e.g. used to implement norm constraints or value constraints for layer weights). The function must take as input the unprojected Tensor representing the value of the variable and return the Tensor for the projected value (which must have the same shape). Constraints are not safe to use when doing asynchronous distributed training.
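    The mechanism described above is projected gradient descent: after every ordinary gradient step, a projection function maps the weights back into the feasible set. The idea can be sketched without TensorFlow in plain NumPy; all names below (the toy quadratic cost, `project_nonneg`) are illustrative, not part of any API:

    ```python
    import numpy as np

    def project_nonneg(w):
        # Projection onto the non-negative orthant, analogous to
        # tf.clip_by_value(x, 0, np.inf) in the answer above.
        return np.clip(w, 0.0, None)

    rng = np.random.default_rng(0)
    w = rng.random((5, 10)) - 0.5      # some entries start out negative
    target = np.zeros((5, 10))

    lr = 0.1
    for _ in range(100):
        grad = 2.0 * (w - target)      # gradient of a toy quadratic cost
        w = w - lr * grad              # ordinary gradient step
        w = project_nonneg(w)          # constraint applied after the update
    ```

    After the loop, every entry of `w` is non-negative, which is exactly the guarantee the `constraint` argument provides after each optimizer step.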
