Questions tagged [cross-entropy]

In machine learning and information theory, cross-entropy is a measure of the dissimilarity between two probability distributions over the same underlying set of events (it is not a true distance: it is asymmetric and does not vanish for identical distributions). Cross-entropy is the most common choice of loss function in neural networks for classification tasks.
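For reference, given a true distribution p and a predicted distribution q over the same set of events, the cross-entropy is

```latex
H(p, q) = -\sum_{x} p(x) \log q(x)
```

For classification with one-hot labels this reduces to the negative log-probability the model assigns to the correct class, which is why it doubles as the standard negative log-likelihood loss.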

360 questions
0 votes · 0 answers

How can I implement tf.nn.sparse_softmax_cross_entropy by myself

I want to implement tf.nn.sparse_softmax_cross_entropy by myself, but after some batches the loss became NaN! Here is my code: logits_batch_size = tf.shape(logits)[0] labels = tf.reshape(tgt_seq, [-1]) eps =…
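The snippet above is truncated, but NaNs in a hand-rolled sparse softmax cross-entropy usually come from computing log(softmax(x)) directly, where an underflowed probability turns into log(0). A minimal NumPy sketch of the numerically stable log-sum-exp formulation (the function and variable names are illustrative, not from the question):

```python
import numpy as np

def sparse_softmax_cross_entropy(logits, labels):
    """logits: (batch, num_classes) floats; labels: (batch,) int class ids."""
    # Subtract the row max so exp() cannot overflow.
    shifted = logits - logits.max(axis=1, keepdims=True)
    # log(softmax) computed as shifted - log(sum(exp(shifted))): never log(0).
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Pick out the log-probability of each example's true class.
    return -log_probs[np.arange(labels.shape[0]), labels]

print(sparse_softmax_cross_entropy(
    np.array([[2.0, 1.0, 0.1]]), np.array([0])))  # ≈ [0.417]
```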
0 votes · 2 answers

Understanding TensorFlow function output

Could someone explain why the following code generates the output array([ 0.59813887, 0.69314718], dtype=float32)? For example, -numpy.log(0.5) = 0.69314718, but where does the 0.59813887 come from? import tensorflow as tf res1 =…
sunxd · 743
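The code in this question is cut off, but tf.nn.softmax_cross_entropy_with_logits first applies softmax to the logits and then computes −Σ y·log(softmax(x)) per row, so the result is not simply the log of a raw input. A NumPy sketch with illustrative logits (an assumption, not the asker's values) that happen to reproduce both printed numbers:

```python
import numpy as np

def softmax_xent(labels, logits):
    z = logits - logits.max(axis=-1, keepdims=True)       # stable softmax
    p = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return -(labels * np.log(p)).sum(axis=-1)             # per-row loss

labels = np.array([[1.0, 0.0], [1.0, 0.0]])
logits = np.array([[0.2, 0.0], [0.0, 0.0]])
print(softmax_xent(labels, logits))  # ≈ [0.59813887, 0.69314718]
```

Equal logits give a softmax of [0.5, 0.5], hence −log(0.5) ≈ 0.69314718; a value like 0.59813887 comes from unequal logits whose softmax assigns ≈ 0.5498 to the labeled class.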
0 votes · 0 answers

Using `softmax_cross_entropy_with_logits()` with `seq2seq.sequence_loss()`

I have a working RNN using the default softmax loss function for tf.contrib.seq2seq.sequence_loss() (which I'm assuming is tf.nn.softmax()) but would instead like to use tf.nn.softmax_cross_entropy_with_logits(). According to the…
Mark Cramer · 2,614
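For the record, the default loss inside tf.contrib.seq2seq.sequence_loss is sparse softmax cross-entropy over integer target ids, not tf.nn.softmax(). A hedged sketch of plugging in softmax_cross_entropy_with_logits through the softmax_loss_function argument, assuming TF 1.x, where the custom function is called with integer labels= and float logits= (the tensor shapes and num_classes are stand-ins, not from the question):

```python
import tensorflow as tf

num_classes = 12                                  # hypothetical vocabulary size
logits = tf.random_normal([4, 7, num_classes])    # (batch, time, vocab)
targets = tf.zeros([4, 7], dtype=tf.int32)        # integer target ids
weights = tf.ones([4, 7])                         # per-step loss weights

def dense_xent(labels, logits):
    # sequence_loss hands this function integer ids, but
    # softmax_cross_entropy_with_logits wants one-hot rows, so convert first.
    one_hot = tf.one_hot(labels, depth=num_classes)
    return tf.nn.softmax_cross_entropy_with_logits(labels=one_hot, logits=logits)

loss = tf.contrib.seq2seq.sequence_loss(
    logits, targets, weights, softmax_loss_function=dense_xent)
```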
0 votes · 1 answer

tf.nn.sparse_softmax_cross_entropy_with_logits - rank error

Here is my code: import tensorflow as tf with tf.Session() as sess: y = tf.constant([0,0,1]) x = tf.constant([0,1,0]) r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x) sess.run() …
user1700890 · 7,144
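The excerpt is cut off, but as written the snippet has three separate problems: the sparse variant expects integer class indices (rank one lower than the logits), the logits must be floating point, and sess.run() needs a tensor to fetch. One way it could be made runnable, assuming TF 1.x:

```python
import tensorflow as tf

with tf.Session() as sess:
    logits = tf.constant([[0.0, 1.0, 0.0]])  # float logits, shape (batch, classes)
    labels = tf.constant([2])                # class index per example, shape (batch,)
    r = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
    print(sess.run(r))                       # pass the tensor to fetch its value
```

Here labels = [2] encodes the same class as the one-hot vector [0, 0, 1] from the question.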
0 votes · 1 answer

Tensorflow: Output probabilities from sigmoid cross entropy loss

I have a CNN for a multilabel classification problem and as a loss function I use tf.nn.sigmoid_cross_entropy_with_logits. From the cross-entropy equation I would expect the output to be probabilities for each class, but instead I get…
Mewtwo · 1,231
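As a general note (the question is truncated): tf.nn.sigmoid_cross_entropy_with_logits returns per-class loss values, not probabilities; the probabilities come from applying the sigmoid to the same logits. A minimal sketch, assuming TF 1.x:

```python
import tensorflow as tf

logits = tf.constant([[2.0, -1.0, 0.5]])  # raw network outputs
labels = tf.constant([[1.0, 0.0, 1.0]])   # multi-label targets

probs = tf.sigmoid(logits)                # per-class probabilities, for prediction
loss = tf.reduce_mean(                    # scalar training loss
    tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits))
```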
0 votes · 1 answer

softmax cross entropy return value

What does it mean if this is the return value of tf.losses.softmax_cross_entropy_loss? Does the fact that it states value:0 and shape=() mean that nothing was computed?
haxtar · 1,962
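A scalar TensorFlow op prints exactly like that: value:0 is the end of the tensor's name and shape=() marks a 0-d (scalar) tensor; nothing is computed until the tensor is run. A sketch, assuming the TF 1.x function is tf.losses.softmax_cross_entropy (whose ops live under a softmax_cross_entropy_loss name scope, which would explain the printed name):

```python
import tensorflow as tf

onehot_labels = tf.constant([[0.0, 1.0]])
logits = tf.constant([[1.0, 2.0]])
loss = tf.losses.softmax_cross_entropy(onehot_labels, logits)

print(loss)                 # Tensor(".../value:0", shape=(), dtype=float32): metadata only
with tf.Session() as sess:
    print(sess.run(loss))   # now the scalar value is actually computed
```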
0 votes · 1 answer

Keras custom loss function dtype error

I have a NN that has two identical CNNs (similar to a Siamese network), then merges the outputs, and intends to apply a custom loss function to the merged output, something like this: ----------------- ----------------- | input_a …
Salman · 1
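The question is truncated, but dtype errors in Keras custom losses commonly come from mixing integer targets with float predictions; casting inside the loss is the usual fix. A generic sketch under that assumption (the loss body is illustrative, not the asker's):

```python
from keras import backend as K

def custom_loss(y_true, y_pred):
    # Cast the targets to the backend float type so every op sees one dtype.
    y_true = K.cast(y_true, K.floatx())
    return K.mean(K.binary_crossentropy(y_true, y_pred), axis=-1)

# usage: model.compile(optimizer='adam', loss=custom_loss)
```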
0 votes · 1 answer

I can't get Caffe working

After some struggling, I decided to try the simplest possible task: training a network to classify whether a number is non-negative. And I failed... I generated the data with the following code, and I'm not sure if it is right. I read the data back from the…
0 votes · 0 answers

How does the cross-entropy speed up backpropagation on the hidden layers?

I am learning from http://neuralnetworksanddeeplearning.com/chap3.html. It says the cross-entropy cost function can speed up the network because the σ′(z) is canceled on the last layer. The partial derivative for the last layer L: ∂C/∂w =…
Joey · 175
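Filling in the truncated derivatives from that chapter, for a single sigmoid output neuron a = σ(z) with input x:

```latex
\text{Quadratic cost:}\quad \frac{\partial C}{\partial w} = (a - y)\,\sigma'(z)\,x
\qquad\qquad
\text{Cross-entropy cost:}\quad \frac{\partial C}{\partial w} = (a - y)\,x
```

The σ′(z) factor cancels because ∂C/∂a = (a − y)/(a(1 − a)) for the cross-entropy, while σ′(z) = a(1 − a). Note the cancellation only happens at the output layer; the hidden layers still carry σ′ factors, but their error terms are backpropagated from the now-larger output-layer gradient, which is the usual answer to the question above.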
0 votes · 0 answers

I'm getting NaN in cross entropy, which is not reasonable

I'm tracing the problem in my network back to find the reason for the NaN coming from the cross entropy. I took the values from the network at exactly the point where I start to get NaN and tested them alone in another function. Here I'm testing the 50 values…
Feras · 834
0 votes · 1 answer

Does binary log loss exclude one part of the equation based on y?

Assuming the log loss equation to be logLoss = −(1/N) ∑_{i=1}^{N} [ y_i log(p_i) + (1−y_i) log(1−p_i) ], where N is the number of samples, y_1, …, y_N are the actual values of the dependent variable, and p_1, …, p_N are the predicted likelihoods from logistic regression. How…
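Since each y_i is either 0 or 1, exactly one of the two terms survives for every sample, so one part of the equation is indeed switched off by the label. A quick NumPy check:

```python
import numpy as np

def log_loss(y, p):
    # For y_i = 1 the (1 - y_i) * log(1 - p_i) term is multiplied by zero;
    # for y_i = 0 the y_i * log(p_i) term is.
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

y = np.array([1.0, 0.0])
p = np.array([0.9, 0.2])
print(log_loss(y, p))  # = -(log(0.9) + log(0.8)) / 2 ≈ 0.1643
```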
0 votes · 1 answer

Relationship between Cross-Entropy Loss and Accuracy

I am training a CNN on an image classification task. On a simple version this worked fine, but when I made the images more difficult I now encounter this phenomenon (I let it train overnight): while training, the training cross-entropy loss goes…
0 votes · 0 answers

Caffe produces negative loss values (multi-label classification with LMDB)

I am trying to do multi-label classification based on an LMDB database. I created two different databases: one for the images themselves and one for the labels. My intention is to have two different labels, for angles in the horizontal and vertical directions.…
user4911648
0 votes · 0 answers

Tensorflow tf.nn.in_top_k: targets out of range error?

I have figured out what was causing this error: it was due to a mismatch between labels and outputs. I'm doing 8-class sentiment classification and my labels are (1,2,3,4,7,8,9,10), so it was unable to match the predictions (1,2,3,4,5,6,7,8) with my…
Gary Grey · 145
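For anyone hitting the same thing: tf.nn.in_top_k (like the sparse softmax cross-entropy ops) requires targets in the range [0, num_classes), so non-contiguous label values have to be remapped to 0-based ids first. A small sketch (raw_labels is a hypothetical stand-in for the asker's label array):

```python
# Remap arbitrary label values, e.g. (1,2,3,4,7,8,9,10), to contiguous
# ids 0..7 that can be matched against the 8 prediction slots.
label_values = [1, 2, 3, 4, 7, 8, 9, 10]
to_id = {v: i for i, v in enumerate(label_values)}  # e.g. 7 -> 4

raw_labels = [10, 1, 7]                 # hypothetical example labels
mapped = [to_id[v] for v in raw_labels]
print(mapped)                           # [7, 0, 4]
```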
0 votes · 1 answer

tensorflow: understanding cross entropy calculation with reduce_mean and reduce_sum

I was looking at TensorFlow's basic neural network tutorial for beginners [1]. I am having trouble understanding the calculation of the cross-entropy value and how it's used. In the example a placeholder is created to hold the correct labels: y_ =…
user2051561 · 838
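The tutorial being referenced (MNIST for ML beginners) computes the cross-entropy in two reduction steps, which is the part that usually trips people up. A sketch of that calculation in TF 1.x, mirroring the tutorial's own code:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 784])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, W) + b)       # predicted class distribution
y_ = tf.placeholder(tf.float32, [None, 10])  # one-hot correct labels

# Inner reduce_sum collapses axis 1: per-example loss -sum_c y'_c * log(y_c).
# Outer reduce_mean averages those per-example losses over the batch.
cross_entropy = tf.reduce_mean(
    -tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))
```

(reduction_indices is the older TF 1.x alias for what is now the axis argument.)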