In machine learning and information theory, cross-entropy is a measure of the dissimilarity between two probability distributions over the same underlying set of events (it is not a true distance: it is asymmetric and does not vanish when the two distributions coincide). Cross-entropy is the standard choice of loss function for classification tasks in neural networks.
Questions tagged [cross-entropy]
360 questions
1
vote
2 answers
Difference between logloss in sklearn and BCEloss in Pytorch?
Looking at the documentation for log loss in sklearn and BCELoss in PyTorch, these should be the same, i.e. just the normal log loss with weights applied. However, they behave differently, both with and without weights applied. Can anyone explain it…

Peter Alexander
- 101
- 3
- 10
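For comparison, here is a minimal sketch (dummy labels and probabilities, not the poster's data) that evaluates sklearn.metrics.log_loss and torch.nn.BCELoss on the same inputs. Without weights the two agree; a likely source of discrepancy with weights is that log_loss divides by the sum of the sample weights, while BCELoss's default 'mean' reduction still divides by the number of elements.

import numpy as np
import torch
from sklearn.metrics import log_loss

# Dummy binary labels and predicted probabilities (illustrative values only).
y_true = np.array([1.0, 0.0, 1.0, 1.0])
y_prob = np.array([0.9, 0.2, 0.7, 0.6])

# sklearn: mean negative log-likelihood over the samples.
sk_loss = log_loss(y_true, y_prob)

# PyTorch: BCELoss with the default reduction='mean' on the same values.
pt_loss = torch.nn.BCELoss()(torch.tensor(y_prob), torch.tensor(y_true)).item()

print(sk_loss, pt_loss)  # both print (approximately) the same number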
1
vote
1 answer
What model (loss function, etc.) can be used in Keras for categorical training with probability labels instead of one-hot encoding?
I ran into a problem when designing my Keras model.
The training data (input) to the model consists of two sequential character-encoded lists and a non-sequential, ordinary feature list. The output is a list of probabilities over 5 different classes. The testing data…

Jingwu
- 11
- 2
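For reference, Keras's categorical_crossentropy does not require one-hot targets: any probability distribution over the classes is a valid label. A minimal sketch under assumed shapes (the layer sizes and random data below are placeholders, not the poster's model):

import numpy as np
from tensorflow import keras

# Toy data: 100 samples, 20 features; soft labels over 5 classes.
x = np.random.rand(100, 20).astype("float32")
y = np.random.rand(100, 5).astype("float32")
y /= y.sum(axis=1, keepdims=True)  # each row is a probability distribution

model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(20,)),
    keras.layers.Dense(5, activation="softmax"),
])

# categorical_crossentropy accepts soft targets, not just one-hot vectors.
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.fit(x, y, epochs=2, batch_size=16)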
1
vote
0 answers
Sparse categorical crossentropy loss in tflearn
I was translating the MNIST example at https://www.tensorflow.org/tutorials/ from Keras to tflearn. However, tflearn has no sparse_categorical_crossentropy, only categorical_crossentropy, so I had to convert all the y vectors (which were…

Jonathan Lindgren
- 1,192
- 3
- 14
- 31
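If the framework only offers categorical_crossentropy, the usual workaround is exactly the conversion the poster describes: one-hot encode the integer labels up front. A small sketch using Keras's to_categorical (label values are illustrative):

import numpy as np
from tensorflow.keras.utils import to_categorical

# Integer class labels, as they would be fed to sparse_categorical_crossentropy.
y_sparse = np.array([3, 0, 7, 1])

# One-hot encode them so plain categorical_crossentropy can be used instead.
y_onehot = to_categorical(y_sparse, num_classes=10)
print(y_onehot.shape)  # (4, 10)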
1
vote
1 answer
Does tensorflow compute the cross entropy only with single precision?
I am trying to fully understand the computation of the cross entropy in TensorFlow. In the following piece of code, I use NumPy to generate double-precision random data x, transform it to logits for binary classification (i.e., only one logit…

Mauricio Fernández
- 179
- 1
- 10
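A quick way to probe the precision question is to feed float64 tensors into the op and check the dtype of the result; a minimal sketch with tf.nn.sigmoid_cross_entropy_with_logits (dummy data, TensorFlow 2.x eager execution assumed):

import numpy as np
import tensorflow as tf

# Double-precision logits and labels for a binary problem.
logits = tf.constant(np.random.randn(8), dtype=tf.float64)
labels = tf.constant(np.random.randint(0, 2, size=8).astype(np.float64))

loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)
print(loss.dtype)  # float64 if the op preserves the input precision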
1
vote
0 answers
Output Format using Sparse Categorical Cross Entropy in Keras for Multi-Class Classification
I've built a U-Net architecture using the Keras Functional API, but I'm having trouble using the sparse categorical cross entropy loss function. My learning task is multi-class, pixel-wise classification for many 256x256 images. The intended output is a…

A. LaBella
- 427
- 1
- 4
- 13
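With sparse_categorical_crossentropy the model output and the labels have different ranks: the network emits a probability per class for every pixel, while the labels stay as integer class ids without a one-hot channel. A shape-only sketch (image size and class count are placeholders):

import numpy as np
import tensorflow as tf

num_classes = 4
batch, h, w = 2, 256, 256

# Model output: a softmax distribution over classes for every pixel.
y_pred = tf.nn.softmax(tf.random.uniform((batch, h, w, num_classes)), axis=-1)

# Labels: one integer class id per pixel, no one-hot channel dimension.
y_true = np.random.randint(0, num_classes, size=(batch, h, w))

loss = tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)
print(loss.shape)  # (2, 256, 256): one loss value per pixel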
1
vote
1 answer
Multinomial naive bayes softmax altering
In scikit-learn, I was doing multi-class classification on labelled text data using MultinomialNB.
When predicting, I used the predict_proba method of MultinomialNB:
clf = MultinomialNB()
print(clf.fit(X_train, Y_train))
clf.predict_proba(X_test[0])
As…

Meenakshisundaram
- 55
- 3
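For context, predict_proba on a fitted MultinomialNB returns one row per sample containing the posterior probability of each class (each row sums to 1), and predict is just its argmax. A minimal sketch on toy count data:

import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Toy bag-of-words counts: 6 documents, 4 terms, 3 classes.
X = np.random.randint(0, 5, size=(6, 4))
y = np.array([0, 1, 2, 0, 1, 2])

clf = MultinomialNB()
clf.fit(X, y)

# Each row of proba sums to 1; the predicted class is the argmax.
proba = clf.predict_proba(X[:1])
print(proba, proba.sum(axis=1))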
1
vote
0 answers
Weighted binary cross entropy dice loss for segmentation problem
I am using a weighted binary cross-entropy Dice loss for a segmentation problem with class imbalance (80 times more black pixels than white pixels).
def weighted_bce_dice_loss(y_true, y_pred):
    y_true = K.cast(y_true, 'float32')
    y_pred =…

AKSHAYAA VAIDYANATHAN
- 2,715
- 7
- 30
- 51
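One common way to write such a loss with the Keras backend is to blend a pixel-weighted binary cross-entropy with a soft Dice term. The sketch below is an illustrative variant, not the poster's exact function; the positive-class weight of 80 and the smoothing constant are assumptions:

import tensorflow as tf
from tensorflow.keras import backend as K

def weighted_bce_dice_loss(y_true, y_pred, pos_weight=80.0, smooth=1.0):
    y_true = K.cast(y_true, 'float32')
    y_pred = K.cast(y_pred, 'float32')

    # Weighted binary cross-entropy: up-weight the rare white pixels.
    bce = K.binary_crossentropy(y_true, y_pred)
    weights = y_true * pos_weight + (1.0 - y_true)
    weighted_bce = K.mean(weights * bce)

    # Soft Dice loss over the full masks.
    intersection = K.sum(y_true * y_pred)
    dice = (2.0 * intersection + smooth) / (K.sum(y_true) + K.sum(y_pred) + smooth)

    return weighted_bce + (1.0 - dice)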
1
vote
2 answers
How do I apply the binary cross-entropy element-wise and then sum all these losses in Keras?
I want to write a function with two arguments, A and B, tensors of the same shape (for example, 13x13, or some other shape), that returns a single number representing the sum of all the losses obtained by applying binary cross-entropy element-wise. So,…

Alem
- 283
- 1
- 13
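A sketch of one way to do this with the Keras backend: apply K.binary_crossentropy element-wise to the two tensors and then K.sum over all entries (the 13x13 shape below is just the example from the question):

import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K

def summed_bce(a, b):
    # Element-wise binary cross-entropy between two tensors of the same
    # shape, then summed over every entry into a single scalar.
    a = K.cast(a, 'float32')
    b = K.cast(b, 'float32')
    return K.sum(K.binary_crossentropy(a, b))

# Example: two 13x13 arrays of values in (0, 1).
a = np.random.rand(13, 13)
b = np.random.rand(13, 13)
print(summed_bce(tf.constant(a), tf.constant(b)).numpy())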
1
vote
2 answers
MemoryError when calling to_categorical in keras
I am trying to run a language modeling program. When I use training data consisting of a document with 15,000 sentences, the program runs properly. But when I switch to a dataset ten times bigger, it encounters an error as…

Bily
- 51
- 9
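A common way to sidestep this MemoryError is not to call to_categorical at all: keep the targets as integer word ids and train with sparse_categorical_crossentropy, so the huge (num_samples x vocab_size) one-hot matrix is never allocated. A hedged sketch with placeholder vocabulary size and shapes:

import numpy as np
from tensorflow import keras

vocab_size = 20000
seq_len = 10

# Integer word-id inputs and targets; no to_categorical, so no giant one-hot array.
x = np.random.randint(0, vocab_size, size=(1000, seq_len))
y = np.random.randint(0, vocab_size, size=(1000,))

model = keras.Sequential([
    keras.layers.Embedding(vocab_size, 64, input_length=seq_len),
    keras.layers.LSTM(64),
    keras.layers.Dense(vocab_size, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(x, y, epochs=1, batch_size=64)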
1
vote
0 answers
How to fuse multiple inputs using entropy?
I want to know how I can use entropy instead of summation in TensorFlow.
I have code for object classification, and I use a summation operation to add two tensors.
For example:
layer1 = tf.nn.conv2d(inp, [1,1,3,32], [1,1,1,1], 'SAME')
layer2 =…

programmer
- 577
- 1
- 9
- 21
1
vote
0 answers
Error stalls at a constant value in a neural net for MNIST data
Edit:
I made the following changes and got an accuracy of about 80%:
changed epochs to 2300,
changed the learning rate to 0.000006,
changed np.random.rand to np.random.randn.
Epochs and learning rate I can understand, but I have no idea what's up with rand…

blaze
- 13
- 1
- 1
- 5
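For what it's worth, the rand-versus-randn difference is about weight initialization: np.random.rand draws from the uniform distribution on [0, 1) (all weights positive, mean about 0.5), while np.random.randn draws from a zero-mean standard normal, which tends to keep activations and gradients better conditioned. A small sketch with made-up layer sizes:

import numpy as np

n_in, n_out = 784, 128

# Uniform in [0, 1): all weights positive, mean ~0.5, so activations saturate easily.
w_rand = np.random.rand(n_in, n_out)

# Zero-mean standard normal, often scaled down further (e.g. * 0.01).
w_randn = np.random.randn(n_in, n_out) * 0.01

print(w_rand.mean(), w_randn.mean())  # roughly 0.5 vs roughly 0.0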
1
vote
2 answers
NaN with softmax cross entropy in simple model with dummy inputs
I was simplifying my model in order to see where the NaN error occurs and narrowed it down to my loss function:
import tensorflow as tf
from tensorflow.python import debug as tf_debug
def train_input_fn():
    pass
def model_fn(features, labels,…

user2368505
- 416
- 3
- 16
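A frequent cause of NaNs in this setting is applying a softmax (or a log) by hand before a loss that already expects raw logits; the numerically safe pattern is to pass unnormalized logits straight to the fused op. A minimal sketch with dummy values:

import tensorflow as tf

# Raw, unnormalized logits (no softmax applied by hand) and integer labels.
logits = tf.constant([[2.0, -1.0, 0.3],
                      [0.1,  4.0, -2.0]])
labels = tf.constant([0, 1])

# The fused op applies softmax and log internally in a numerically stable way.
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
print(tf.reduce_mean(loss).numpy())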
1
vote
2 answers
Keras loss consistently low but accuracy starts high then drops
First off, my assumptions might be wrong:
Loss is how far from the correct answer each training example is (then divided by the number of examples - kind of a mean loss).
Accuracy is how many training examples are correct (if the highest output is…

FraserOfSmeg
- 1,128
- 2
- 23
- 41
1
vote
1 answer
In pytorch, how to use the weight parameter in F.cross_entropy()?
I'm trying to write some code like below:
x = Variable(torch.Tensor([[1.0,2.0,3.0]]))
y = Variable(torch.LongTensor([1]))
w = torch.Tensor([1.0,1.0,1.0])
F.cross_entropy(x,y,w)
w = torch.Tensor([1.0,10.0,1.0])
F.cross_entropy(x,y,w)
However, the…

konchy
- 573
- 5
- 16
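For reference, the third positional argument of F.cross_entropy is the per-class weight: a 1-D tensor with one entry per class. Note that with the default 'mean' reduction the weighted losses are also divided by the summed target weights, so for a single sample the printed value can look unchanged; this is expected behaviour, not the weight being ignored. A minimal sketch (modern PyTorch no longer needs Variable):

import torch
import torch.nn.functional as F

x = torch.tensor([[1.0, 2.0, 3.0]])  # logits for one sample over 3 classes
y = torch.tensor([1])                # target class index

# Uniform class weights: ordinary cross-entropy.
print(F.cross_entropy(x, y, weight=torch.tensor([1.0, 1.0, 1.0])))

# Up-weighting class 1: the per-sample loss is scaled by 10, but the 'mean'
# reduction divides by the summed target weights (10 here), so the value matches.
print(F.cross_entropy(x, y, weight=torch.tensor([1.0, 10.0, 1.0])))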
1
vote
1 answer
Logits representation in TensorFlow’s sparse_softmax_cross_entropy
I have a question regarding the sparse_softmax_cross_entropy cost function in TensorFlow.
I want to use it in a semantic segmentation context where I use an autoencoder architecture which uses typical convolution operations to downsample images to…

Bastian
- 1,553
- 13
- 33
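In the segmentation setting the logits keep a trailing class dimension while the labels do not: tf.nn.sparse_softmax_cross_entropy_with_logits pairs logits of shape [batch, height, width, num_classes] with integer labels of shape [batch, height, width]. A shape-only sketch with dummy tensors:

import tensorflow as tf

batch, h, w, num_classes = 2, 64, 64, 5

# Decoder output: one unnormalized score per class for every pixel.
logits = tf.random.normal((batch, h, w, num_classes))

# Ground truth: one integer class id per pixel (no one-hot encoding).
labels = tf.random.uniform((batch, h, w), maxval=num_classes, dtype=tf.int32)

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
print(loss.shape)  # (2, 64, 64): per-pixel loss; use tf.reduce_mean for a scalar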