tf.nn.embedding_lookup(params, ids, partition_strategy='mod', name=None)
I cannot understand what this function does. Is it like a lookup table?
Since I was also intrigued by this function, I'll give my two cents.
The way I see it, in the 2D case it is just a matrix multiplication (and it's easy to generalize to other dimensions).
Consider a vocabulary with N symbols. Then you can represent a symbol x as a one-hot-encoded vector of dimensions Nx1.
But you want a representation of this symbol not as an Nx1 vector, but as one with dimensions Mx1, called y.
So, to transform x into y, you can use an embedding matrix E, with dimensions MxN:
y = E x.
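To make this concrete, here is a minimal NumPy sketch (the sizes N and M, the matrix E, and the index idx are made-up values for illustration) showing that multiplying by a one-hot vector simply picks out one column of E:

    import numpy as np

    N, M = 5, 3                                   # vocabulary size N, embedding size M
    E = np.arange(N * M, dtype=np.float32).reshape(M, N)  # embedding matrix, MxN

    idx = 2                                       # the symbol we want to embed
    x = np.zeros((N, 1), dtype=np.float32)
    x[idx] = 1.0                                  # one-hot column vector, Nx1

    y = E @ x                                     # Mx1 embedding of the symbol
    # Multiplying by a one-hot vector just selects column idx of E:
    assert np.allclose(y[:, 0], E[:, idx])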
This is essentially what tf.nn.embedding_lookup(params, ids, ...) is doing, with the nuance that instead of the one-hot-encoded vector x you just pass ids: the integer positions of the 1s, so the lookup amounts to selecting rows of params by index.
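A minimal sketch of the actual call (assuming TensorFlow 2.x eager execution; in 1.x the same call exists but you would evaluate the result inside a session). Note that params here stores one embedding per row, i.e. it plays the role of the transpose of E above:

    import tensorflow as tf

    params = tf.constant([[0.0, 0.1, 0.2],
                          [1.0, 1.1, 1.2],
                          [2.0, 2.1, 2.2],
                          [3.0, 3.1, 3.2]])   # 4 symbols, each with a 3-dimensional embedding

    ids = tf.constant([2, 0, 2])              # positions of the 1s in the one-hot vectors

    # Gathers rows 2, 0 and 2 of params; no one-hot vectors or matmul are materialized.
    embedded = tf.nn.embedding_lookup(params, ids)
    print(embedded.numpy())
    # [[2.  2.1 2.2]
    #  [0.  0.1 0.2]
    #  [2.  2.1 2.2]]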