I was going through this example of an LSTM language model on GitHub (link). What it does in general is pretty clear to me, but I'm still struggling to understand what calling `contiguous()` does here.
From the [pytorch documentation][1]:
contiguous() → Tensor
Returns a contiguous tensor containing the same data as self tensor. If self tensor is contiguous, this function returns the self tensor.
Here *contiguous* means not only contiguous in memory, but also laid out in memory in the same order as the indices. For example, a transposition doesn't change the data in memory; it simply changes the mapping from indices to memory locations. Applying `contiguous()` afterwards rewrites the data in memory so that the mapping from indices to memory locations is the canonical (row-major) one.
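A minimal sketch of this behaviour (assuming a reasonably recent PyTorch build; `stride()` and `is_contiguous()` make the layout visible):

```python
import torch

# A 2x3 tensor is laid out row-major in memory.
x = torch.arange(6).view(2, 3)
print(x.is_contiguous())   # True
print(x.stride())          # (3, 1)

# transpose only swaps the index-to-memory mapping (the strides);
# the underlying storage is untouched, so the result is non-contiguous.
y = x.t()
print(y.is_contiguous())   # False
print(y.stride())          # (1, 3)

# Operations like view() need a canonical layout and would raise a
# RuntimeError on y; contiguous() copies the data into row-major order.
z = y.contiguous()
print(z.is_contiguous())   # True
print(z.stride())          # (2, 1)
print(z.view(6))           # now works
```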
[1]: http://pytorch.org/docs/master/tensors.html