My problem seems to be an easy one, but I can't figure out the right syntax in Python/TensorFlow. I have a simple neural network with an input layer, one hidden layer, and one output layer. The output layer consists of two neurons. Here is the problem: I want to keep the first output neuron linear, while the second output neuron should have a sigmoid activation function. I found that there is no such thing as "sliced assignment" in TensorFlow, but I did not find a work-around either.
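To illustrate what I mean by "sliced assignment" (plain NumPy, just for illustration):

    import numpy as np

    out = np.zeros((4, 2))
    # with a NumPy array I can assign through a slice ...
    out[:, 1] = 1.0 / (1.0 + np.exp(-out[:, 1]))
    # ... but the same item assignment on a tf.Tensor fails,
    # because TensorFlow tensors do not support item assignment.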
Here is an example snippet:
    import sys
    import tensorflow as tf

    def multilayer_perceptron(x, weights, biases, act_fct):
        # hidden layer: affine transform followed by the chosen activation
        layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'], name='hidden_layer_op')
        if act_fct == 'sigmoid':
            layer_1 = tf.nn.sigmoid(layer_1)
            print('sigmoid')
        elif act_fct == 'relu':
            print('relu')
            layer_1 = tf.nn.relu(layer_1)
        elif act_fct == 'linear':
            print('linear')
        else:
            print('Unknown activation function')
            sys.exit()
        # output layer: two neurons, linear so far
        out_layer = tf.add(tf.matmul(layer_1, weights['out']), biases['out'], name='output_layer_op')
        # DOES NOT WORK! (tensors do not support item assignment)
        out_layer[1] = tf.nn.sigmoid(out_layer[1])
        return out_layer
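The only work-around I could think of so far would be to replace the two broken lines above by slicing the two output columns apart and gluing them back together, roughly like this (untested sketch; it assumes out_layer has shape [batch_size, 2] and a TF 1.x-style tf.concat(values, axis)):

        # hypothetical work-around: rebuild the output from two slices
        linear_part  = out_layer[:, 0:1]                  # first neuron, kept linear
        sigmoid_part = tf.nn.sigmoid(out_layer[:, 1:2])   # second neuron through a sigmoid
        out_layer    = tf.concat([linear_part, sigmoid_part], axis=1)

But I am not sure whether slicing and re-concatenating like this is the intended way to do it.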
I am sure there is a very simple way to do this; hopefully someone can help me with it. (All the variables passed to the function have been initialized accordingly beforehand.)
Best regards and thanks!