Questions tagged [cross-entropy]

In machine learning and information theory, the cross entropy measures the dissimilarity between two probability distributions over the same underlying set of events: it is smallest when the two distributions agree, although it is not a true distance (it is asymmetric, and its minimum is the entropy of the true distribution rather than zero). Cross entropy is a common choice of loss function in neural networks for classification tasks.
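For reference, the quantity in question is H(p, q) = -Σ_x p(x) log q(x). A minimal NumPy sketch (with made-up distributions) showing that it is smallest when the predicted distribution q matches the true distribution p:

```python
import numpy as np

def cross_entropy(p, q):
    # H(p, q) = -sum_x p(x) * log q(x); np.log is the natural log, so the value is in nats
    return -np.sum(p * np.log(q))

p = np.array([0.1, 0.7, 0.2])                        # hypothetical true distribution
print(cross_entropy(p, np.array([0.3, 0.4, 0.3])))   # larger: q disagrees with p
print(cross_entropy(p, p))                           # smallest possible value: the entropy of p
```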

360 questions
2 votes, 1 answer

TensorFlow sequence_loss with label_smoothing

Would it be possible to use the label_smoothing feature from tf.losses.softmax_cross_entropy with tf.contrib.seq2seq.sequence_loss? I can see that sequence_loss optionally takes a softmax_loss_function as a parameter. However, this function would…
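A hedged sketch of one way this could be wired up in TF 1.x (the placeholder shapes and the smoothing value 0.1 are assumptions; the exact signature expected by softmax_loss_function should be checked against the installed version):

```python
import tensorflow as tf  # TF 1.x with tf.contrib, as in the question

logits = tf.placeholder(tf.float32, [None, None, 1000])   # [batch, time, vocab] (assumed shape)
targets = tf.placeholder(tf.int32, [None, None])          # [batch, time] token ids
weights = tf.placeholder(tf.float32, [None, None])        # [batch, time] mask

def smoothed_softmax_loss(labels, logits):
    # sequence_loss passes sparse integer labels; one-hot encode them so that
    # tf.losses.softmax_cross_entropy can apply label_smoothing
    onehot = tf.one_hot(labels, depth=tf.shape(logits)[-1])
    return tf.losses.softmax_cross_entropy(
        onehot, logits, label_smoothing=0.1,
        reduction=tf.losses.Reduction.NONE)  # per-element losses, as sequence_loss expects

loss = tf.contrib.seq2seq.sequence_loss(
    logits, targets, weights,
    softmax_loss_function=smoothed_softmax_loss)
```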
2 votes, 2 answers

log_loss in sklearn: Multioutput target data is not supported with label binarization

The following code from sklearn import metrics import numpy as np y_true = np.array([[0.2,0.8,0],[0.9,0.05,0.05]]) y_predict = np.array([[0.5,0.5,0.0],[0.5,0.4,0.1]]) metrics.log_loss(y_true, y_predict) produces the following error: …
user1700890
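For reference, log_loss expects hard class labels in y_true, so soft targets like these are one common trigger for that error; a framework-free sketch (not sklearn's own behavior) that scores the same arrays by computing the cross entropy directly:

```python
import numpy as np

y_true = np.array([[0.2, 0.8, 0.0], [0.9, 0.05, 0.05]])
y_pred = np.array([[0.5, 0.5, 0.0], [0.5, 0.4, 0.1]])

# clip to keep log() finite, then average the per-row cross entropies
eps = 1e-15
ce = -np.mean(np.sum(y_true * np.log(np.clip(y_pred, eps, 1.0)), axis=1))
print(ce)
```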
2 votes, 2 answers

Does sigmoid_cross_entropy produce the mean loss over the whole batch?

I have a multi-label classification task and there are 6 labels. Any sample may have none or several of the labels set to 1. I have used the loss in tensorflow: self.loss = tf.losses.sigmoid_cross_entropy(self.labels, self.logits) Every time a batch (1000) of…
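One way to check this empirically (a sketch with assumed placeholder shapes): the default reduction divides the summed loss by the number of non-zero-weight label entries, so with unit weights it behaves like a mean over all batch * 6 values:

```python
import tensorflow as tf  # TF 1.x, matching the question

labels = tf.placeholder(tf.float32, [None, 6])
logits = tf.placeholder(tf.float32, [None, 6])

# default reduction (SUM_BY_NONZERO_WEIGHTS) with unit weights averages over every label entry
mean_loss = tf.losses.sigmoid_cross_entropy(labels, logits)

# per-element losses for comparison, shape [batch_size, 6]
per_element = tf.losses.sigmoid_cross_entropy(
    labels, logits, reduction=tf.losses.Reduction.NONE)
check = tf.reduce_mean(per_element)  # should equal mean_loss when all weights are 1
```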
2 votes, 1 answer

How is Cross Entropy Loss Converted to a Scalar During Optimization?

I have a basic beginner question about how neural networks are defined, and I am learning in the context of the Keras library. Following the MNIST hello world program, I have defined this network: model = Sequential() model.add(Dense(NB_CLASSES,…
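As a rough illustration of what happens (a NumPy sketch with made-up numbers, not Keras internals verbatim): the loss produces one value per sample, and the optimizer is handed the mean over the batch, which is a single scalar:

```python
import numpy as np

# two one-hot targets and two predicted probability vectors
y_true = np.array([[0, 1, 0], [1, 0, 0]])
y_pred = np.array([[0.1, 0.8, 0.1], [0.7, 0.2, 0.1]])

per_sample = -np.sum(y_true * np.log(y_pred), axis=1)  # shape (2,): one loss per sample
batch_loss = per_sample.mean()                         # the scalar used for the gradient step
```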
2 votes, 0 answers

TensorFlow: same code gives different results on different computers

While using tf.nn.softmax_cross_entropy_with_logits in the training process, the result frequently gives NaN or exceptionally large cross entropy values. (Windows 7 64bit, Python 3.6 (Anaconda 4.4.0), Tensorflow 1.4, NVIDIA Titan X Pascal, CUDA 8.0, CUDNN…
JH Jung
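Two things often worth trying when debugging this kind of machine-to-machine difference (a sketch; the placeholder shapes and seed value are assumptions): pin the graph-level seed so both machines start from the same weights, and fail fast the moment the loss leaves the finite range:

```python
import tensorflow as tf  # TF 1.4, as in the question

tf.set_random_seed(1234)  # same graph-level seed on both machines

y_ = tf.placeholder(tf.float32, [None, 10])      # one-hot labels (shape is an assumption)
logits = tf.placeholder(tf.float32, [None, 10])  # raw, pre-softmax network output

cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
# raise immediately with a readable message if the loss ever becomes NaN or Inf
cross_entropy = tf.check_numerics(cross_entropy, "cross entropy is NaN or Inf")
```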
2 votes, 0 answers

Tried to modify the categorical_crossentropy loss in keras with class weights, but it does not work during training

I'm doing semantic segmentation in keras and tried to modify the categorical_crossentropy loss so that the loss is class-weighted. Here is my code: def class_weighted_categorical_crossentropy(output, target, from_logits=False): """Categorical…
nubs91
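One frequent pitfall with this symptom is the argument order: Keras calls a custom loss as loss(y_true, y_pred), not (output, target). A minimal class-weighted sketch (the three weights below are hypothetical placeholders, not values from the question):

```python
import numpy as np
from keras import backend as K

# hypothetical per-class weights; index i weights class i
class_weights = K.constant(np.array([0.5, 2.0, 1.0], dtype="float32"))

def class_weighted_categorical_crossentropy(y_true, y_pred):
    # clip to keep log() finite, then weight each pixel by the weight of its true class
    y_pred = K.clip(y_pred, K.epsilon(), 1.0 - K.epsilon())
    return K.mean(-K.sum(class_weights * y_true * K.log(y_pred), axis=-1))

# model.compile(optimizer="adam", loss=class_weighted_categorical_crossentropy)
```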
2 votes, 1 answer

xgboost: Huge logloss despite reasonable accuracy

I train an xgboost classifier on a binary classification problem. It produces 70% accurate predictions, yet the logloss is very large at 9.13. I suspect that might be because a few predictions are very far off the target, but I do not understand why it…
ikamen
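A small worked example (made-up numbers) of how a handful of confidently wrong predictions can dominate logloss even at decent accuracy: a near-certain wrong prediction contributes on the order of -log(1e-6) ≈ 13.8 to the sum before averaging:

```python
from sklearn.metrics import log_loss

y_true = [1, 1, 1, 0]
# three reasonable predictions and one confidently wrong one (P(class 1) = 0.999999 for a true 0)
y_prob = [0.9, 0.8, 0.7, 0.999999]

# the single near-certain mistake dominates the average, so the logloss is large
# even though accuracy is 3/4
print(log_loss(y_true, y_prob))
```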
2 votes, 1 answer

Why do RNNs use cross entropy as a loss function

I am very new to neural networks and was wondering why all of the examples of RNNs, especially char-rnns, use cross entropy as their loss function. I have googled but can't seem to come across any discussions of the function in this…
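Part of the intuition, as a tiny sketch with a made-up 3-character vocabulary: at each step the target is one concrete next character, so the cross entropy collapses to the negative log probability the softmax assigned to that character:

```python
import numpy as np

probs_over_vocab = np.array([0.05, 0.70, 0.25])  # hypothetical softmax output for one step
true_char_index = 1                              # index of the actual next character

# cross entropy with a one-hot target is just -log of the probability given to the true char
step_loss = -np.log(probs_over_vocab[true_char_index])
print(step_loss)  # ~0.36; would be ~3.0 had the model put only 0.05 on the right character
```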
2 votes, 1 answer

categorical_crossentropy returns a small loss value even if accuracy is 1.00 in keras

I have an LSTM model which is designed for a multi-classification problem. When training, the accuracy is actually 1.00, but it still returns a small loss value. What does it mean? All targets are predicted correctly. Why can the loss value not be…
jef
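A one-line illustration (made-up numbers) of why this is expected: accuracy only checks the argmax, while categorical cross entropy penalizes any probability below 1.0 for the true class:

```python
import numpy as np

y_true = np.array([0, 1, 0])
y_pred = np.array([0.2, 0.6, 0.2])   # argmax is the correct class, so accuracy is 1.00

loss = -np.sum(y_true * np.log(y_pred))
print(loss)  # ~0.51: nonzero because the model only gave the true class probability 0.6
```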
2 votes, 1 answer

Tensorflow entropy is NaN for large inputs when training CNN

I've created a simple convolutional neural network with TensorFlow. When I use input images with edge = 32px the network works fine, but if I double the edge to 64px then the entropy returns NaN. The question is how to fix that? The CNN structure is…
Verych
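A frequent cause with this symptom is computing the cross entropy by hand as -sum(y * log(softmax(...))), which yields log(0) as soon as any softmax output underflows (more likely with larger inputs and logits). A hedged sketch of the usual fix, with assumed placeholder shapes:

```python
import tensorflow as tf  # TF 1.x, as in the question

y_ = tf.placeholder(tf.float32, [None, 10])      # one-hot labels (shapes are assumptions)
logits = tf.placeholder(tf.float32, [None, 10])  # raw, pre-softmax output of the network

# unstable variant that can return NaN once a softmax entry underflows to 0:
# cross_entropy = -tf.reduce_sum(y_ * tf.log(tf.nn.softmax(logits)))

# numerically stable: let TensorFlow fuse the softmax and the log internally
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
```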
2 votes, 0 answers

Python NLTK: What's the difference between total entropy and per-word entropy?

I'm required to find both the total cross entropy and the per-word cross entropy of a given text using NLTK. Specifically, I'm using the entropy function here... http://www.nltk.org/_modules/nltk/model/ngram.html ...but I'm unsure whether this calculates…
Wolff
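Whatever the particular NLTK version returns (worth checking in the source at the URL above), the relationship between the two quantities is simple; a framework-free sketch with a hypothetical list of per-word log probabilities:

```python
import math

# hypothetical log2 probabilities the model assigned to each successive word
word_log2_probs = [-2.0, -3.5, -1.0, -4.5]

total_entropy = -sum(word_log2_probs)                      # sums over every word in the text
per_word_entropy = total_entropy / len(word_log2_probs)    # average bits per word
perplexity = math.pow(2, per_word_entropy)                 # the usual companion quantity
print(total_entropy, per_word_entropy, perplexity)
```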
1 vote, 0 answers

Triplet Loss with Cross Entropy Loss?

I'm trying to learn how to use a TripletLoss in a Siamese Network. My goal is to build a classification siamese model, so I suppose I need both a Triplet Loss to minimize distances and a Cross Entropy Loss for classification. I managed to build a…
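A hedged PyTorch-style sketch of one common way the two objectives are combined (the weighting factor alpha, and the idea that the model returns both an embedding and class logits, are assumptions, not details from the question):

```python
import torch.nn as nn

triplet_criterion = nn.TripletMarginLoss(margin=1.0)
ce_criterion = nn.CrossEntropyLoss()

def combined_loss(anchor_emb, positive_emb, negative_emb,
                  anchor_logits, labels, alpha=0.5):
    # weighted sum: the triplet term shapes the embedding space,
    # the cross entropy term trains the classification head
    metric_term = triplet_criterion(anchor_emb, positive_emb, negative_emb)
    class_term = ce_criterion(anchor_logits, labels)
    return alpha * metric_term + (1.0 - alpha) * class_term
```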
1 vote, 1 answer

Where are the actual predictions stored for a Tensorflow Keras CategoricalCrossentropy model?

I'm learning about python and machine learning and reproduced some published code in a Kaggle notebook and modified it for my data within Azure Data Studio running Python 3. (Removed externally located code as per request in comments). The code…
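In case it helps frame the question: a compiled Keras model only stores weights, so the predicted probabilities are whatever model.predict returns for the inputs passed to it (a sketch, assuming the notebook's model and x_test exist):

```python
import numpy as np

probs = model.predict(x_test)            # shape (n_samples, n_classes): softmax probabilities
pred_classes = np.argmax(probs, axis=1)  # hard class labels, one per sample
```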
1 vote, 0 answers

RuntimeError: cuDNN error: CUDNN_STATUS_MAPPING_ERROR

Why is there no way to calculate the loss value? (About CrossEntropyLoss.) My code handles a binary classification problem. I try to calculate the loss value in the final test stage and then use the loss values to draw a histogram. However, there will…
蘇煥淇
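Independent of the cuDNN error itself (often a device mismatch), a common pattern for collecting per-sample test losses for a histogram (a sketch, assuming model, test_loader, and device are already defined in the question's code):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss(reduction="none")  # keep one loss value per sample

model.eval()
losses = []
with torch.no_grad():                               # no autograd graph needed at test time
    for inputs, targets in test_loader:
        inputs, targets = inputs.to(device), targets.to(device)  # same device as the model
        logits = model(inputs)
        losses.extend(criterion(logits, targets).cpu().tolist())
# losses can now be passed to e.g. matplotlib's hist()
```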
1 vote, 1 answer

analyze the train-validation accuracy learning curve

I am building a two-layer neural network from scratch on the Fashion MNIST dataset. I am using ReLU as the activation in between and softmax cross entropy on the last layer. I am getting the below learning curve between train and validation…
ffl
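Since the network is built from scratch, here is a compact, numerically stable NumPy version of the softmax cross entropy used on the last layer (a sketch under the assumption that labels are integer class indices):

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    # shift by the row max before exponentiating so exp() cannot overflow
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # pick out the log probability of each sample's true class and average
    return -log_probs[np.arange(len(labels)), labels].mean()
```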