How to calculate the Cosine similarity between two tensors?

無奈伤痛 2020-12-08 11:29

I have two normalized tensors and I need to calculate the cosine similarity between these tensors. How do I do it with TensorFlow?

cosine(normalize_a, normalize_b)


        
3 Answers
  • 2020-12-08 12:09

    This will do the job:

    import tensorflow as tf

    a = tf.placeholder(tf.float32, shape=[None], name="input_placeholder_a")
    b = tf.placeholder(tf.float32, shape=[None], name="input_placeholder_b")

    # L2-normalize each vector; the dot product of unit vectors is the cosine similarity.
    normalize_a = tf.nn.l2_normalize(a, 0)
    normalize_b = tf.nn.l2_normalize(b, 0)
    cos_similarity = tf.reduce_sum(tf.multiply(normalize_a, normalize_b))

    sess = tf.Session()
    cos_sim = sess.run(cos_similarity, feed_dict={a: [1, 2, 3], b: [2, 4, 6]})
    

    This gives 0.99999988: the vectors [1, 2, 3] and [2, 4, 6] are parallel, so the exact value is 1.0, and the gap is float32 rounding.
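
    For reference, a minimal sketch of the same computation in TF 2.x (eager execution), using the same example inputs as the feed_dict above:

    import tensorflow as tf

    a = tf.constant([1.0, 2.0, 3.0])
    b = tf.constant([2.0, 4.0, 6.0])
    # Normalize each vector; the dot product of unit vectors is the cosine similarity.
    cos_sim = tf.reduce_sum(tf.nn.l2_normalize(a, 0) * tf.nn.l2_normalize(b, 0))
    print(cos_sim.numpy())  # ~1.0, since b is a scalar multiple of a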

  • 2020-12-08 12:15

    Times change. With the latest TF API, this can be computed by calling tf.losses.cosine_distance.

    Example:

    import tensorflow as tf
    import numpy as np

    x = tf.constant(np.random.uniform(-1, 1, 10))
    y = tf.constant(np.random.uniform(-1, 1, 10))

    # cosine_distance expects L2-normalized inputs and returns 1 - cos(x, y).
    s = tf.losses.cosine_distance(tf.nn.l2_normalize(x, 0), tf.nn.l2_normalize(y, 0), dim=0)
    print(tf.Session().run(s))
    

    Of course, 1 - s is the cosine similarity!
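
    To make that relation concrete, here is a minimal sketch in the same TF 1.x style, comparing the reported distance against a direct dot product of the normalized vectors (the names xn, yn, distance, and similarity are illustrative):

    import tensorflow as tf
    import numpy as np

    x = tf.constant(np.random.uniform(-1, 1, 10))
    y = tf.constant(np.random.uniform(-1, 1, 10))
    xn = tf.nn.l2_normalize(x, 0)
    yn = tf.nn.l2_normalize(y, 0)

    distance = tf.losses.cosine_distance(xn, yn, dim=0)  # 1 - cosine similarity
    similarity = tf.reduce_sum(xn * yn)                   # cosine similarity directly

    with tf.Session() as sess:
        d, s = sess.run([distance, similarity])
        print(s, 1.0 - d)  # the two values should agree up to float error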

  • 2020-12-08 12:28

    You can normalize your vector or matrix like this:

    # states: [batch_size, hidden_num]
    states_norm = tf.nn.l2_normalize(states, dim=1)
    # embedding: [batch_size, embedding_dims]
    embedding_norm = tf.nn.l2_normalize(embedding, dim=1)
    # assert hidden_num == embedding_dims
    # After the matmul, the scores are [batch_size, batch_size]:
    user_app_scores = tf.matmul(states_norm, embedding_norm, transpose_b=True)
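
    A self-contained sketch of the same idea with concrete shapes, in the same TF 1.x style (the names states and embedding come from the answer; the sizes and random inputs are illustrative assumptions):

    import tensorflow as tf
    import numpy as np

    batch_size, dims = 4, 8  # illustrative; hidden_num == embedding_dims == dims
    states = tf.constant(np.random.randn(batch_size, dims), dtype=tf.float32)
    embedding = tf.constant(np.random.randn(batch_size, dims), dtype=tf.float32)

    # Row-wise L2 normalization, then one matmul yields every pairwise cosine similarity.
    states_norm = tf.nn.l2_normalize(states, dim=1)
    embedding_norm = tf.nn.l2_normalize(embedding, dim=1)
    user_app_scores = tf.matmul(states_norm, embedding_norm, transpose_b=True)

    with tf.Session() as sess:
        print(sess.run(user_app_scores).shape)  # (4, 4)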
    