Question
If I use tf.transpose, does it also change the memory layout? In numpy you would use np.ascontiguousarray for that.
This matters to me because I use CUDA, where it makes a difference whether the memory layout is [N C H W] or [N H W C] (N ... number of samples, H ... array height, W ... array width, C ... array depth, e.g. RGB channels).
How can I check this?
Answer 1:
If you read the documentation carefully, you can find the answer:
Numpy Compatibility
In numpy transposes are memory-efficient constant time operations as they simply return a new view of the same data with adjusted strides.
TensorFlow does not support strides, so transpose returns a new tensor with the items permuted.
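The numpy side of the quoted behavior can be checked directly: a transpose only permutes the strides of a view, while np.ascontiguousarray copies the data into a new contiguous layout. A small sketch (the shape and variable names are illustrative):

```python
import numpy as np

# An [N, C, H, W] array, C-contiguous in memory.
a = np.zeros((2, 3, 4, 5), dtype=np.float32)

# Transpose to [N, H, W, C]: numpy returns a VIEW with permuted strides.
t = a.transpose(0, 2, 3, 1)

print(a.strides)                 # (240, 80, 20, 4)  -- original layout
print(t.strides)                 # (240, 20, 4, 80)  -- same numbers, permuted
print(t.flags['C_CONTIGUOUS'])   # False: no data was moved, only the view changed
print(t.base is a)               # True: t still reads a's buffer

# ascontiguousarray actually copies into a new contiguous buffer.
c = np.ascontiguousarray(t)
print(c.flags['C_CONTIGUOUS'])   # True
```

Checking `.strides` and `.flags['C_CONTIGUOUS']` is the standard way to answer "how to check this?" on the numpy side.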
Hence tf.transpose returns a new tensor with the items permuted (and is therefore less efficient than numpy's view-based transpose), so yes, it changes the memory layout.
However, if you only need to change the shape, rather than permute the elements, you can use tf.reshape instead of tf.transpose: reshape changes the tensor's shape without reordering the underlying data.
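The difference between reshaping and transposing can be sketched in numpy; per the compatibility note above, tf.reshape behaves analogously (this is a sketch, not TensorFlow code):

```python
import numpy as np

a = np.arange(6).reshape(2, 3)   # [[0, 1, 2], [3, 4, 5]]

r = a.reshape(3, 2)   # reshape: same flat memory order, just a new shape
t = a.transpose()     # transpose: elements are visited in a different order

print(r.ravel())  # [0 1 2 3 4 5] -- memory order unchanged
print(t.ravel())  # [0 3 1 4 2 5] -- a contiguous transpose must reorder data
print(np.shares_memory(a, r))  # True: reshape of a contiguous array is a view
```

So reshape is only a substitute for transpose when the permutation you need does not actually reorder elements.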
Source: https://stackoverflow.com/questions/49949463/does-tf-transpose-also-change-the-memory-like-np-ascontiguousarray