I am trying to get a neural network going in TensorFlow. The dataset is simply the length and width of a flower petal, and the output is either 1 or 0 depending on the type of flower:
x = [[3,1.5],
[2,1],
[4,1.5],
[3,1],
[3.5,0.5],
[2,0.5],
[5.5,1],
[1,1]]
y = [1,
0,
1,
0,
1,
0,
1,
0]
So far my code looks something like this:
import tensorflow as tf

# define variables
x_1 = tf.placeholder(tf.float32, shape=[8,2])
y_1 = tf.placeholder(tf.float32, shape=[8])
w_1 = tf.placeholder(tf.float32, shape=[2,8])
b_1 = tf.placeholder(tf.float32, shape=[8,])
sess = tf.Session()
sess.run(tf.global_variables_initializer())
y_ = tf.matmul(x_1, w_1) + b_1
sigmoid = tf.nn.sigmoid(y_)
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(sigmoid)
for _ in range(50000):
My question is: how would I arrange my 'for' loop so that it takes the whole dataset at once and compares it to the actual output? The MNIST example in TensorFlow uses softmax cross entropy, where you can pass the actual output and the predicted output as parameters of the function. How would I replicate that for this simple dataset in the remaining for loop, so that the code grabs all of the data, makes a prediction, and compares it to the actual output? Also, please point out any problems with the shapes of my variables. Thank you.
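For reference, here is a rough, untested sketch of what I think the complete program might look like, with sigmoid cross entropy swapped in for softmax (since there is only one output per example), tf.Variable used for the weights and bias instead of placeholders, and the whole dataset fed in on every step. The x_data and y_data names are just my own labels for the lists above, and I am not sure the shapes or the learning rate of 0.5 are right, which is part of what I am asking:

import numpy as np
import tensorflow as tf

# the same data as above, as float32 arrays; labels reshaped to a column
x_data = np.array(x, dtype=np.float32)                  # shape [8, 2]
y_data = np.array(y, dtype=np.float32).reshape(8, 1)    # shape [8, 1]

# placeholders for the data, Variables for the trainable parameters
x_1 = tf.placeholder(tf.float32, shape=[None, 2])
y_1 = tf.placeholder(tf.float32, shape=[None, 1])
w_1 = tf.Variable(tf.random_normal([2, 1]))
b_1 = tf.Variable(tf.zeros([1]))

# raw logits, and a loss that compares them to the actual labels
logits = tf.matmul(x_1, w_1) + b_1
loss = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=y_1, logits=logits))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

# feed the whole dataset on every step
for i in range(50000):
    sess.run(train_step, feed_dict={x_1: x_data, y_1: y_data})

# predicted probabilities after training
print(sess.run(tf.nn.sigmoid(logits), feed_dict={x_1: x_data}))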