I would like to create a function that, for every row of a given data matrix X, applies the softmax function only to a few sampled classes, say 2 out of K total classes. In plain Python with NumPy the code looks like this:
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D array
    e = np.exp(z - np.max(z))
    return e / e.sum()

def softy(X, W, num_samples):
    N = X.shape[0]   # number of data rows
    K = W.shape[0]   # total number of classes
    S = np.zeros((N, K))
    ar_to_sof = np.zeros(num_samples)
    sampled_ind = np.zeros(num_samples, dtype=int)
    for line in range(N):
        for samp in range(num_samples):
            # sample a class index uniformly (with replacement)
            sampled_ind[samp] = np.random.randint(0, K)  # high is exclusive
            ar_to_sof[samp] = np.dot(X[line], W[sampled_ind[samp]])
        S[line][sampled_ind] = softmax(ar_to_sof)
    return S
S finally contains zeros everywhere, and non-zero values at the indices defined for each row by the array sampled_ind. I would like to implement this using TensorFlow. The problem is that it involves "advanced" (fancy) indexing, and I cannot find a way to express that with this library.
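To make the expected behaviour concrete, here is a minimal self-contained check of the NumPy version with fixed (rather than random) sampled indices; the hypothetical values of X, W, and sampled_ind are chosen only for illustration:

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D array
    e = np.exp(z - np.max(z))
    return e / e.sum()

# toy example: N=1 row, K=4 classes, 2 sampled classes
X = np.array([[1.0, 2.0]])
W = np.ones((4, 2))
sampled_ind = np.array([1, 3])          # fixed for reproducibility

S = np.zeros((1, 4))
ar_to_sof = X[0] @ W[sampled_ind].T     # logits for the sampled classes only
S[0][sampled_ind] = softmax(ar_to_sof)

print(S)  # [[0.  0.5 0.  0.5]] -- both logits equal 3, so each gets 0.5
```

The non-sampled columns (0 and 2) stay zero, and the sampled columns hold a proper probability distribution summing to 1.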
I am trying to do it with this code:
# N = number of rows, D = input dimensionality, K = number of classes
S = tf.Variable(tf.zeros((N, K)))
tfx = tf.placeholder(tf.float32, shape=(None, D))
wsampled = tf.placeholder(tf.float32, shape=(None, D))
ar_to_sof = tf.matmul(tfx, wsampled, transpose_b=True)
softy = tf.nn.softmax(ar_to_sof)
r = tf.random_uniform(shape=(), minval=0, maxval=K, dtype=tf.int32)
...
for line in range(N):
    sampled_ind = tf.constant(value=[sess.run(r), sess.run(r)], dtype=tf.int32)
    Wsampled = sess.run(tf.gather(W, sampled_ind))
    sess.run(softy, feed_dict={tfx: X[line:line + 1], wsampled: Wsampled})
Everything works up to this point, but I cannot find a way to perform the update I want on the matrix S, i.e. the equivalent of the Python expression S[line][sampled_ind] = ar_to_sof.
How could I make this work?
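One way to express such an update without fancy indexing is a scatter: build (row, column) index pairs and write the softmax values into a zero tensor of shape (N, K). TensorFlow provides tf.scatter_nd(indices, updates, shape) for exactly this; in the graph above it would look roughly like (names are hypothetical, not from the question):

```python
import numpy as np

# NumPy sketch of the scatter that TensorFlow's tf.scatter_nd performs.
# The TF version would look roughly like:
#   idx = tf.stack([tf.fill(tf.shape(sampled_ind), line), sampled_ind], axis=1)
#   S_update = tf.scatter_nd(idx, tf.reshape(softy, [-1]), shape=(N, K))
def scatter_row(line, sampled_ind, values, shape):
    """Place values[k] at (line, sampled_ind[k]); zeros everywhere else."""
    out = np.zeros(shape)
    out[line, sampled_ind] = values  # the fancy-indexed write from the question
    return out

S_update = scatter_row(line=0, sampled_ind=[1, 3], values=[0.7, 0.3], shape=(2, 4))
print(S_update)
# [[0.  0.7 0.  0.3]
#  [0.  0.  0.  0. ]]
```

Since S_update is zero outside row `line`, accumulating the per-row results is just a sum of the scattered tensors. As an aside, if the end goal is training with sampled classes, TensorFlow also ships tf.nn.sampled_softmax_loss, which performs the class sampling internally.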