I'm running into trouble with TensorFlow. Executing the following code:

import tensorflow as tf
import input_data

learning_rate = 0.01
training_epochs = 25
batch_size = 100
display_step = 1

mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# tensorflow graph input
X = tf.placeholder('float', [None, 784]) # mnist data image of shape 28 * 28 = 784
Y = tf.placeholder('float', [None, 10]) # 0-9 digit recognition => 10 classes

# set model weights
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))

# Our hypothesis (unscaled logits; softmax is applied inside the loss)
activation = tf.add(tf.matmul(X, W), b)

# Cost function: cross entropy
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=activation, logits=Y))
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)  # Adam optimizer

I get the following error:

ValueError: No gradients provided for any variable, check your graph for ops that do not support gradients, between variables ['Tensor("Variable/read:0", shape=(784, 10), dtype=float32)', 'Tensor("Variable_1/read:0", shape=(10,), dtype=float32)'] and loss Tensor("Mean:0", shape=(), dtype=float32).

I solved this problem by changing the arguments: (labels=activation, logits=Y) → (labels=Y, logits=activation). It was a logic problem. Thanks. – kevin moon Jan 17 '17 at 05:46

3 Answers


This problem is caused by the following line: tf.nn.softmax_cross_entropy_with_logits(labels=activation, logits=Y)

Based on the documentation, you should have:

labels: Each row labels[i] must be a valid probability distribution.

logits: Unscaled log probabilities.

So logits is supposed to be your hypothesis, i.e. activation, and the valid probability distribution is Y. So just swap the arguments: tf.nn.softmax_cross_entropy_with_logits(labels=Y, logits=activation)
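
For reference, here is a minimal sketch of the corrected graph fragment, reusing the placeholders and variables from the question (everything else is unchanged):

activation = tf.add(tf.matmul(X, W), b)  # unscaled log probabilities (logits)
cost = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=Y, logits=activation))
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)

With the arguments in the right order, the loss actually depends on W and b, so minimize() can compute gradients for both variables and the ValueError goes away.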


I ended up here because I had passed my input X data to my model, but not my expected outputs. I had:

model.fit(X, epochs=30)      # whoops!

I should have had:

model.fit(X, y, epochs=30)   # fixed!
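
A minimal sketch of that failure mode in context (the toy data and the one-layer model here are hypothetical stand-ins, not from the question):

import numpy as np
import tensorflow as tf

# hypothetical MNIST-shaped toy data
X = np.random.rand(100, 784).astype('float32')
y = tf.keras.utils.to_categorical(np.random.randint(0, 10, size=100), num_classes=10)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='softmax', input_shape=(784,)),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')

model.fit(X, y, epochs=30)  # passing both inputs and labels gives the loss something to differentiate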

In my case I had forgotten to compile the model before calling fit:

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
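
As a minimal sketch of the required ordering (the one-layer model is a hypothetical stand-in; X and y are training data as in the previous answer), compile() must run before fit(), since it is what attaches the loss that gradients are computed from:

model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='softmax', input_shape=(784,)),
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=30)  # loss and optimizer are attached, so training can proceed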