It is similar to tf.gather, which returns elements of params at the indices specified by ids.
Example:
params = tf.constant([10, 20, 80, 40])
ids = tf.constant([1, 2, 3])
print(tf.nn.embedding_lookup(params, ids).eval())
This will give the output:
[20,80,40]
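
For a single params tensor this really is just a gather; here is a minimal sketch you can run, assuming TensorFlow 1.x graph mode (a session is needed to evaluate the ops):

import tensorflow as tf

params = tf.constant([10, 20, 80, 40])
ids = tf.constant([1, 2, 3])

with tf.Session() as sess:
    # For a single tensor, embedding_lookup and gather return the same rows.
    print(sess.run(tf.nn.embedding_lookup(params, ids)))  # [20 80 40]
    print(sess.run(tf.gather(params, ids)))                # [20 80 40]
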
However, the main function of embedding_lookup is to retrieve rows of the params tensor. The params argument may be a list of tensors instead of a single tensor.
Example:
param1 = tf.constant([10,2])
param2 = tf.constant([20,30])
ids = tf.constant([2,0,2,1,2,3])
result = tf.nn.embedding_lookup([param1, param2], ids)
In this case, the indices specified in ids correspond to elements of the tensors according to the partition strategy. The partition_strategy argument controls how the ids are distributed across the list; the default partition strategy is 'mod'.
The mod strategy:
Index 0 corresponds to the first element of the first tensor, index 1 to the first element of the second tensor, and so on. Index n cannot correspond to an (n+1)-th tensor, since the params list contains only n tensors, so index n wraps around to the second element of the first tensor. Similarly, index n+1 corresponds to the second element of the second tensor, and so on. In general, with n tensors in the list, id i is looked up in tensor i mod n, at row i // n.
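
As a quick illustration of that mapping in plain Python (the variable names are just for illustration):

n = 2                      # number of tensors in the params list
for i in range(4):         # the possible ids 0..3 for two tensors of length 2
    print(i, '-> tensor', i % n + 1, ', row', i // n)
# 0 -> tensor 1 , row 0    (param1[0] = 10)
# 1 -> tensor 2 , row 0    (param2[0] = 20)
# 2 -> tensor 1 , row 1    (param1[1] = 2)
# 3 -> tensor 2 , row 1    (param2[1] = 30)
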
Now coming back to the code,
param1 = tf.constant([10,2])
param2 = tf.constant([20,30])
ids = tf.constant([2,0,2,1,2,3])
result = tf.nn.embedding_lookup([param1, param2], ids)
Result:
[2 10 2 20 2 30]
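
To actually see that output you need to evaluate the op; a small runnable sketch, assuming TensorFlow 1.x graph mode, with the default 'mod' strategy written out explicitly:

import tensorflow as tf

param1 = tf.constant([10, 2])
param2 = tf.constant([20, 30])
ids = tf.constant([2, 0, 2, 1, 2, 3])
# partition_strategy='mod' is the default, shown here only for clarity.
result = tf.nn.embedding_lookup([param1, param2], ids, partition_strategy='mod')

with tf.Session() as sess:
    print(sess.run(result))  # [ 2 10  2 20  2 30]
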
Hope this helps!