tf.nn.embedding_lookup(params, ids, partition_strategy='mod', name=None)
I cannot understand what this function does. Is it like a lookup table?
Another way to look at it: assume that you flatten out the tensors into a one-dimensional array, and then you perform a lookup on that array.
e.g. Tensor0=[1,2,3], Tensor1=[4,5,6], Tensor2=[7,8,9]
With the 'mod' strategy, the flattened tensor is [1,4,7,2,5,8,3,6,9] (ids are assigned round-robin across the tensors).
A lookup of [0,3,4,1,7] then yields [1,2,5,4,6].
i.e. if the lookup value is 7, for example, and we have 3 tensors (or a tensor with 3 rows), then:
7 / 3 gives remainder 1 and quotient 2, so the element at index 2 of Tensor1 is returned, which is 6.
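The remainder/quotient rule above can be sketched in plain Python (no TensorFlow needed); `mod_lookup` is a hypothetical helper name that mimics what `tf.nn.embedding_lookup` does when `params` is a list of tensors and `partition_strategy='mod'`:

```python
def mod_lookup(partitions, ids):
    """Sketch of the 'mod' partition strategy: id i lives in
    partition i % n, at row i // n (n = number of partitions)."""
    n = len(partitions)
    result = []
    for i in ids:
        part = i % n    # which tensor holds this id (the remainder)
        row = i // n    # position within that tensor (the quotient)
        result.append(partitions[part][row])
    return result

partitions = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(mod_lookup(partitions, [0, 3, 4, 1, 7]))  # -> [1, 2, 5, 4, 6]
```

For id 7 this computes part = 7 % 3 = 1 and row = 7 // 3 = 2, returning Tensor1[2] = 6, matching the worked example above.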