What does tf.nn.embedding_lookup function do?

深忆病人 2020-12-02 04:18
tf.nn.embedding_lookup(params, ids, partition_strategy='mod', name=None)

I cannot understand what this function does. Is it like a lookup table?

8 Answers
  •  既然无缘
    2020-12-02 04:29

    The embedding_lookup function retrieves rows of the params tensor. The behavior is similar to indexing into an array in NumPy, e.g.

    import numpy as np

    matrix = np.random.random([1024, 64])  # 1024 embeddings, 64 dimensions each
    ids = np.array([0, 5, 17, 33])
    print(matrix[ids])  # prints a matrix of shape [4, 64]
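
    tf.nn.embedding_lookup does the same kind of gather on a TensorFlow tensor. A minimal sketch, assuming TensorFlow 2.x eager execution (the tensor shapes below are just for illustration):

    import tensorflow as tf

    params = tf.random.uniform([1024, 64])          # embedding matrix
    ids = tf.constant([0, 5, 17, 33])
    embedded = tf.nn.embedding_lookup(params, ids)  # gathers rows 0, 5, 17, 33
    print(embedded.shape)                           # (4, 64)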
    

    The params argument can also be a list of tensors, in which case the ids will be distributed among them. For example, given a list of 3 tensors of shape [2, 64], the default behavior is that shard 0 holds ids [0, 3], shard 1 holds [1, 4], and shard 2 holds [2, 5].

    partition_strategy controls how the ids are distributed among the list. Partitioning is useful for large-scale problems where the embedding matrix might be too large to keep in one piece.
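
    A rough sketch of the 'mod' strategy, using the TF 1.x signature from the question (the shard names and shapes below are made up for illustration):

    import tensorflow as tf  # TF 1.x-style API, matching the signature in the question

    # Three shards of shape [2, 64]; together they hold ids 0..5.
    shards = [tf.get_variable("shard_%d" % i, shape=[2, 64]) for i in range(3)]

    # With partition_strategy='mod', id i is stored in shard i % 3, so
    # shard 0 holds ids 0 and 3, shard 1 holds 1 and 4, shard 2 holds 2 and 5.
    ids = tf.constant([0, 3, 4])
    embedded = tf.nn.embedding_lookup(shards, ids, partition_strategy='mod')
    # embedded has shape [3, 64]

    The alternative 'div' strategy instead assigns contiguous ranges of ids to each shard (shard 0 gets [0, 1], shard 1 gets [2, 3], and so on).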
