
I have a piece of code that uses the sigmoid activation function for classification, which outputs values in [0, 1]. But I need an activation function that outputs binary values, either 0 or 1.

        import tensorflow as tf

        x = tf.placeholder("float", [None, COLUMN])

        # Hidden layer
        Wh = tf.Variable(tf.random_normal([COLUMN, UNITS_OF_HIDDEN_LAYER], mean=0.0, stddev=0.05))
        h = tf.nn.sigmoid(tf.matmul(x, Wh))

        # Output layer
        Wo = tf.Variable(tf.random_normal([UNITS_OF_HIDDEN_LAYER, COLUMN], mean=0.0, stddev=0.05))
        y = tf.nn.sigmoid(tf.matmul(h, Wo))

        # Objective function: fraction of correct predictions per batch
        y_ = tf.placeholder("float", [None, COLUMN])
        correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
        cost = tf.reduce_sum(tf.cast(correct_prediction, "float")) / BATCH_SIZE

Can you please tell me how to replace the sigmoid function with a binary step one?

user3104352

2 Answers

y = tf.round(tf.nn.sigmoid(tf.matmul(h, Wo)))

That will give you a 0 or 1 output.
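In the context of the question's network, a minimal sketch (variable names h and Wo are taken from the question; note that tf.round has a zero gradient almost everywhere, so you would typically train on the continuous sigmoid output and round only when making predictions):

        # Continuous sigmoid output in [0, 1]; keep this for training,
        # since rounding kills the gradient.
        y_prob = tf.nn.sigmoid(tf.matmul(h, Wo))

        # Hard output for prediction: each value rounds to the nearest
        # integer, giving 0 or 1.
        y_binary = tf.round(y_prob)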

Ryan Jay

You don't need a sigmoid in this case. Try relu(sign(x)).
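For example, applied to the question's output layer (a sketch, reusing h and Wo from the question): tf.sign maps the pre-activation to -1, 0, or 1, and tf.nn.relu then clips the negatives to 0, so the result is a hard binary step. The same caveat applies as with rounding: its gradient is zero, so it is only useful at prediction time, not for training.

        # Binary step on the pre-activation: tf.sign yields -1, 0, or 1,
        # and tf.nn.relu zeroes out the -1s, leaving only 0 or 1.
        y = tf.nn.relu(tf.sign(tf.matmul(h, Wo)))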

Hongyu Sun