Computing the Jacobian matrix in TensorFlow


Question


I want to calculate the Jacobian matrix with TensorFlow.

What I have:

def compute_grads(fn, vars, data_num):
    grads = []
    for n in range(0, data_num):
        for v in vars:
            # Gradient of the n-th scalar entry of fn with respect to variable v
            grads.append(tf.gradients(tf.slice(fn, [n, 0], [1, 1]), v)[0])
    return tf.reshape(tf.stack(grads), shape=[data_num, -1])

Here fn is the loss tensor, vars are all the trainable variables, and data_num is the number of data points.

But as the number of data points grows, running compute_grads takes a tremendous amount of time. Any ideas?


Answer 1:


Assuming that X and Y are TensorFlow tensors and that Y depends on X:

from tensorflow.python.ops.parallel_for.gradients import jacobian

J = jacobian(Y, X)

The result has the shape Y.shape + X.shape and provides the partial derivative of each element of Y with respect to each element of X.
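For concreteness, here is a minimal TF 1.x graph-mode sketch of how this might be used; the shapes, the Y = X * X relationship, and the fed values are illustrative assumptions, not part of the original answer:

import tensorflow as tf
from tensorflow.python.ops.parallel_for.gradients import jacobian

# Y depends elementwise on X, so the Jacobian is diagonal.
X = tf.placeholder(tf.float32, shape=[3])
Y = X * X
J = jacobian(Y, X)  # shape Y.shape + X.shape = [3, 3]

with tf.Session() as sess:
    print(sess.run(J, feed_dict={X: [1.0, 2.0, 3.0]}))
    # [[2. 0. 0.]
    #  [0. 4. 0.]
    #  [0. 0. 6.]]

Under the hood, jacobian uses the parallel-for (pfor) transformation to vectorize the per-element gradient computation, which is what makes it much faster than a Python loop like the one in the question.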




Answer 2:


Assuming you are using TensorFlow 2, or TensorFlow 1.x with eager mode enabled, you can use GradientTape and its built-in jacobian method:

with tf.GradientTape() as g:
    x = tf.constant([1.0, 2.0])
    g.watch(x)
    y = x * x
jacobian = g.jacobian(y, x)
# jacobian value is [[2., 0.], [0., 4.]]

Check the official documentation for more details.
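Tying this back to the question, here is a hedged sketch of computing a per-example Jacobian of a loss vector with respect to all trainable variables; the model, data shapes, and loss function are hypothetical stand-ins, not from the original post:

import tensorflow as tf

# Hypothetical model and data: 4 examples (data_num = 4), 3 features each.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
x = tf.random.normal([4, 3])
y = tf.random.normal([4, 1])

with tf.GradientTape() as tape:
    # One scalar loss per example, shape [4].
    per_example_loss = tf.reduce_sum(tf.square(model(x) - y), axis=1)

# One Jacobian per variable, each with shape [4] + variable.shape.
jacobians = tape.jacobian(per_example_loss, model.trainable_variables)

GradientTape.jacobian vectorizes the computation with the same parallel-for machinery as Answer 1 by default, so it avoids the slow Python-level loop over examples used in compute_grads.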



Source: https://stackoverflow.com/questions/50244270/computing-jacobian-matrix-in-tensorflow
