Does tf.transpose also change the memory (like np.ascontiguousarray)?

最后都变了 - Submitted on 2021-01-03 07:13:50

Question


If I use tf.transpose, does it also change the memory layout?

In NumPy, the function np.ascontiguousarray is used for this.

I mean, this would be important if I use CUDA, because it makes a difference whether the memory layout is [N C H W] or [N H W C]. (N = number of samples, H = array height, W = array width, C = array depth, e.g. RGB channels)

How can I check this?


Answer 1:


If you read the documentation carefully, you can find the answer:

Numpy Compatibility

In numpy transposes are memory-efficient constant time operations as they simply return a new view of the same data with adjusted strides.

TensorFlow does not support strides, so transpose returns a new tensor with the items permuted.

Hence tf.transpose returns a new tensor with the items permuted in memory (and is therefore more expensive than a NumPy transpose), so yes, it changes the memory layout.
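
You can verify this yourself by comparing the two libraries: NumPy's transpose returns a non-contiguous view of the same buffer, while the result of tf.transpose, once converted back to NumPy, is already densely packed. A minimal sketch, assuming TensorFlow 2.x with eager execution (the shapes and variable names are just for illustration):

```python
import numpy as np
import tensorflow as tf

# NumPy: transpose is a constant-time view; only the strides change.
a = np.zeros((2, 3, 4, 5), dtype=np.float32)   # [N, H, W, C]
t = a.transpose(0, 3, 1, 2)                    # [N, C, H, W]
print(t.flags['C_CONTIGUOUS'])    # False: just a re-strided view
print(np.shares_memory(a, t))     # True: same underlying buffer

# np.ascontiguousarray materializes the permuted layout in memory.
c = np.ascontiguousarray(t)
print(c.flags['C_CONTIGUOUS'])    # True: the data was copied

# TensorFlow: tf.transpose produces a new, densely packed tensor.
x = tf.zeros((2, 3, 4, 5))                     # [N, H, W, C]
y = tf.transpose(x, perm=[0, 3, 1, 2])         # [N, C, H, W]
print(y.numpy().flags['C_CONTIGUOUS'])  # True: layout was rearranged
```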

However, if you only need to change the shape rather than permute the axes, you can use tf.reshape instead of tf.transpose: it changes the tensor's shape without rearranging (or copying) the underlying data.
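
As a short illustration of that distinction (again a sketch assuming TensorFlow 2.x):

```python
import tensorflow as tf

x = tf.range(24)                     # flat buffer of 24 elements
y = tf.reshape(x, (2, 3, 4))         # new shape, same row-major data

# tf.reshape only reinterprets the flat buffer in row-major order;
# it cannot swap axes. Going from NHWC to NCHW still requires
# tf.transpose, which copies the data as described above.
z = tf.transpose(y, perm=[2, 0, 1])  # axis permutation -> copy
```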



Source: https://stackoverflow.com/questions/49949463/does-tf-transpose-also-change-the-memory-like-np-ascontiguousarray
