Questions tagged [cross-entropy]

In machine learning and information theory, cross-entropy measures the dissimilarity between two probability distributions over the same underlying set of events: for distributions p and q over events x, H(p, q) = -Σ_x p(x) log q(x). It is not a true distance, since it is asymmetric and does not vanish when p = q (it then equals the entropy of p), but smaller values do mean that q is closer to p. Cross-entropy is the most common choice of loss function in neural networks for classification tasks.

360 questions
0
votes
1 answer

Difference between Logistic Loss and Cross Entropy Loss

I'm confused about logistic loss and cross-entropy loss in the binary classification scenario. According to Wikipedia (https://en.wikipedia.org/wiki/Loss_functions_for_classification), the logistic loss is defined as: where v = y * y_hat. The cross entropy…
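For reference, with labels y in {-1, +1}, raw score y_hat, and v = y * y_hat, the logistic loss log(1 + exp(-v)) is the same function as binary cross-entropy applied to sigmoid(y_hat) with the labels remapped to {0, 1}. A minimal NumPy check of that equivalence (all names here are illustrative):

```python
import numpy as np

def logistic_loss(y_pm, y_hat):
    # y_pm in {-1, +1}; y_hat is the raw score, so v = y_pm * y_hat
    return np.log(1.0 + np.exp(-y_pm * y_hat))

def binary_cross_entropy(y01, y_hat):
    # y01 in {0, 1}; the sigmoid turns the raw score into a probability
    p = 1.0 / (1.0 + np.exp(-y_hat))
    return -(y01 * np.log(p) + (1 - y01) * np.log(1 - p))

y_hat = np.array([-2.0, -0.5, 0.5, 2.0])
y_pm = np.array([1, -1, 1, -1])        # labels in {-1, +1}
y01 = (y_pm + 1) // 2                  # the same labels in {0, 1}

print(np.allclose(logistic_loss(y_pm, y_hat), binary_cross_entropy(y01, y_hat)))
```

Up to the base of the logarithm, this equivalence is why the two terms are often used interchangeably in the binary case.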
0
votes
2 answers

Dimensions in TensorFlow / Keras and sparse_categorical_crossentropy

I cannot understand how to use a TensorFlow dataset as input for my model. I have an X as (n_sample, max_sentence_size) and a y as (n_sample), but I cannot match the dimensions, and I am not sure what TensorFlow does internally. Below you can find a…
thibaultbl
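For reference, sparse_categorical_crossentropy expects integer class ids of shape (n_sample,) as targets and per-class model outputs of shape (n_sample, n_classes), so y needs no one-hot encoding. A NumPy sketch of that shape contract (not the actual Keras implementation):

```python
import numpy as np

def sparse_categorical_crossentropy(y_true, y_pred):
    # y_true: integer class ids, shape (n_sample,)
    # y_pred: per-class probabilities, shape (n_sample, n_classes)
    return -np.log(y_pred[np.arange(len(y_true)), y_true])

y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1]])    # (n_sample=2, n_classes=3)
y_true = np.array([0, 1])               # (n_sample,), no one-hot needed

print(sparse_categorical_crossentropy(y_true, y_pred).mean())
```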
0
votes
0 answers

Cross entropy loss returns infinity in neural network

I am implementing a neural network and using a cross-entropy loss function. The cross-entropy error is given by: error = -np.sum((actual_Y * np.log2(Y_pred)) + ((1 - actual_Y) * np.log2(1 - Y_pred))) After a few iterations, (1 - Y_pred) inside…
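The infinity typically comes from evaluating log2 at 0 once Y_pred saturates at exactly 0 or 1. A common fix, sketched here in NumPy with an illustrative eps, is to clip the predictions away from the endpoints first:

```python
import numpy as np

def binary_cross_entropy(actual_Y, Y_pred, eps=1e-12):
    # Clip predictions away from exact 0 and 1 so neither log2 term
    # is evaluated at 0 (which would return -inf).
    Y_pred = np.clip(Y_pred, eps, 1.0 - eps)
    return -np.sum(actual_Y * np.log2(Y_pred)
                   + (1 - actual_Y) * np.log2(1 - Y_pred))

actual_Y = np.array([1.0, 0.0, 1.0])
Y_pred = np.array([1.0, 0.0, 0.5])   # saturated values that would break log2
print(np.isfinite(binary_cross_entropy(actual_Y, Y_pred)))
```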
0
votes
1 answer

categorical_crossentropy expects targets to be binary matrices

First of all, I am not a programmer, but I am teaching myself Deep Learning to undertake a real project with my own dataset. My situation can be broken down as follows: I am trying to undertake a multiclass text classification project. I have a…
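For reference, categorical_crossentropy expects each target row to be a one-hot (binary) vector rather than an integer class id. A minimal NumPy sketch of the conversion (Keras ships the same utility as tensorflow.keras.utils.to_categorical):

```python
import numpy as np

def to_one_hot(labels, n_classes):
    # Turn integer class ids of shape (n_sample,) into a binary matrix
    # of shape (n_sample, n_classes), as categorical_crossentropy expects.
    return np.eye(n_classes)[labels]

labels = np.array([2, 0, 1])
print(to_one_hot(labels, 3))
# In Keras: tensorflow.keras.utils.to_categorical(labels, num_classes=3)
```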
0
votes
1 answer

Why does the loss decrease but the accuracy also decrease (PyTorch, LSTM)?

I have built a model with LSTM and Linear modules in PyTorch for a classification problem (10 classes). I am training the model, and for each epoch I output the loss and accuracy on the training set. The output is as follows: epoch: 0 start! Loss:…
user12546101
0
votes
1 answer

Using categorical_crossentropy for only two classes

Computer vision and deep learning literature usually say one should use binary_crossentropy for a binary (two-class) problem and categorical_crossentropy for more than two classes. Now I am wondering: is there any reason to not use the latter for a…
Matthias
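For reference, a two-unit softmax head trained with categorical_crossentropy computes exactly the same loss as a one-unit sigmoid head trained with binary_crossentropy on the difference of the two logits, so the two setups are mathematically equivalent for two classes. A NumPy check with illustrative values:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

logits = np.array([[0.3, 1.1], [2.0, -0.5]])   # two-unit softmax head
y = np.array([1, 0])                            # integer class ids

p = softmax(logits)
cce = -np.log(p[np.arange(len(y)), y])          # categorical_crossentropy

# one-unit sigmoid head on the logit difference
p1 = 1.0 / (1.0 + np.exp(-(logits[:, 1] - logits[:, 0])))
bce = -(y * np.log(p1) + (1 - y) * np.log(1 - p1))

print(np.allclose(cce, bce))
```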
0
votes
1 answer

Semantic Segmentation — Using Categorical Cross-Entropy instead of Binary Cross-Entropy for Binary Image Segmentation

I have read both on Stack Overflow and on Cross Validated and still feel I do not completely understand the following matter. If I have a binary segmentation (consider a medical problem where you have healthy and damaged tissue), which loss is better to use, BCE or…
0
votes
1 answer

Keras BinaryCrossentropy loss gives NaN for angular distance between two vectors

I want to train a siamese LSTM such that the angular distance of two outputs is 1 (low similarity) if the corresponding label is 0, and 0 (high similarity) if the label is 1. I took the formula for angular distance from here:…
Jonathan R
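A frequent cause of NaN in this setup is evaluating arccos at the edge of its domain: floating-point error can push the cosine similarity just outside [-1, 1], where arccos returns NaN and its gradient diverges. A NumPy sketch of the usual clipping fix (a Keras implementation would use backend ops instead; eps is illustrative):

```python
import numpy as np

def angular_distance(u, v, eps=1e-7):
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Rounding can push cos just outside [-1, 1]; there arccos returns
    # NaN and its derivative blows up, so clip before calling it.
    cos = np.clip(cos, -1.0 + eps, 1.0 - eps)
    return np.arccos(cos) / np.pi

u = np.array([1.0, 2.0, 3.0])
print(angular_distance(u, 2.0 * u))   # parallel vectors: distance near 0
```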
0
votes
1 answer

Inconsistencies of loss function results with Keras

I am implementing a CNN coupled to a multiple instance learning layer. In brief, I've got this, with C the number of categories: [1 batch of images, 1 label] -> CNN -> custom final layer -> [1 vector of size C]. My final layer just sums up the…
beardybear
0
votes
2 answers

CNN algorithm predicts value of 1.0 and thus the cross-entropy cost function gives a divide by zero warning

I am using a CNN to do binary classification. The cross-entropy cost is calculated by the code: (-1 / m) * np.sum(np.multiply(Y, np.log(AL)) + np.multiply(1 - Y, np.log(1 - AL))) When the algorithm predicts a value of 1.0, this cost…
林文烨
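The warning comes from np.log(1 - AL) being evaluated at 0 when AL is exactly 1.0 (and symmetrically from np.log(AL) when AL is 0.0). One common fix, sketched here with an illustrative eps, is to clip AL strictly inside (0, 1) before taking logs; built-in Keras and PyTorch losses do this internally:

```python
import numpy as np

def cost(Y, AL, m, eps=1e-8):
    # Keep AL strictly inside (0, 1) so neither np.log call sees 0.
    AL = np.clip(AL, eps, 1.0 - eps)
    return (-1 / m) * np.sum(np.multiply(Y, np.log(AL))
                             + np.multiply(1 - Y, np.log(1 - AL)))

Y = np.array([[1.0, 0.0]])
AL = np.array([[1.0, 1.0]])   # the problematic prediction of exactly 1.0
print(cost(Y, AL, m=2))       # finite instead of a divide-by-zero warning
```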
0
votes
1 answer

How to use torch.nn.CrossEntropyLoss as autoencoder's reconstruction loss?

I want to compute the reconstruction accuracy of my autoencoder using CrossEntropyLoss: ae_criterion = nn.CrossEntropyLoss() ae_loss = ae_criterion(X, Y) where X is the autoencoder's reconstruction and Y is the target (since it is an autoencoder, Y…
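For reference, nn.CrossEntropyLoss is built for classification: it expects raw logits of shape (N, C) and integer class-index targets of shape (N,), so a real-valued reconstruction cannot be passed as the target. For continuous reconstructions, nn.MSELoss (or nn.BCEWithLogitsLoss for inputs scaled to [0, 1]) is the usual choice. A NumPy sketch of what CrossEntropyLoss actually computes, to show why the shapes matter:

```python
import numpy as np

def cross_entropy_loss(logits, target):
    # What nn.CrossEntropyLoss computes: log-softmax over the class
    # axis, then the mean negative log-probability of the target ids.
    z = logits - logits.max(axis=1, keepdims=True)
    log_softmax = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_softmax[np.arange(len(target)), target].mean()

logits = np.array([[2.0, 0.5, -1.0]])    # (N, C) unnormalized scores
target = np.array([0])                   # (N,) integer class ids
print(cross_entropy_loss(logits, target))
```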
0
votes
1 answer

Display loss in a TensorFlow DQN without leaving tf.Session()

I have a DQN all set up and working, but I can't figure out how to display the loss without leaving the TensorFlow session. I first thought it involved creating a new function or class, but I'm not sure where to put it in the code, and what…
Rayna Levy
0
votes
1 answer

How to make my logistic regression faster

I have to do simple logistic regression (only in NumPy; I can't use PyTorch or TensorFlow). Data: part of MNIST. Goal: I should have an accuracy of about 86%. Unfortunately I only have about 70%, and my loss function oscillates strangely. It must be something…
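A common reason NumPy logistic regression is slow is looping over samples in Python; the whole gradient step can be written with matrix operations instead. A sketch on toy data (shapes chosen to match MNIST's 784 features; all names are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(X, y, w, b, lr):
    # One fully vectorized gradient-descent step for logistic
    # regression: no Python loop over the samples.
    m = len(y)
    p = sigmoid(X @ w + b)
    grad_w = X.T @ (p - y) / m
    grad_b = (p - y).mean()
    return w - lr * grad_w, b - lr * grad_b

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 784))          # MNIST-sized feature vectors
y = (X[:, 0] > 0).astype(float)          # toy labels, linearly separable
w, b = np.zeros(784), 0.0
for _ in range(200):
    w, b = train_step(X, y, w, b, lr=0.1)
print(((sigmoid(X @ w + b) > 0.5) == y).mean())
```

An oscillating loss usually points to a learning rate that is too high or to unscaled inputs; dividing MNIST pixel values by 255 and lowering lr are the first things to try.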
0
votes
0 answers

Weighted binary crossentropy in U-Net (Keras)

I am currently working on a modified version of the U-Net (https://lmb.informatik.uni-freiburg.de/people/ronneber/u-net/) and tried to implement a weighted binary crossentropy loss function in Keras. def weighted_pixelwise_crossentropy(self,…
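One common form of weighted binary cross-entropy scales the positive-class term by a weight to counter class imbalance. A NumPy sketch of that idea (pos_weight and the function shape are illustrative; a Keras version would use backend ops such as K.clip and K.mean):

```python
import numpy as np

def weighted_pixelwise_crossentropy(y_true, y_pred, pos_weight, eps=1e-7):
    # Binary cross-entropy per pixel, with the positive-class term
    # scaled by pos_weight to counter class imbalance.
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    loss = -(pos_weight * y_true * np.log(y_pred)
             + (1 - y_true) * np.log(1 - y_pred))
    return loss.mean()

y_true = np.array([[1.0, 0.0], [0.0, 1.0]])   # toy 2x2 ground-truth mask
y_pred = np.array([[0.9, 0.2], [0.1, 0.7]])
print(weighted_pixelwise_crossentropy(y_true, y_pred, pos_weight=5.0))
```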
0
votes
1 answer

Output of custom loss in Keras

I know there are many questions about custom loss functions in Keras, but I've been unable to answer this one even after 3 hours of googling. Here is a very simplified example of my problem. I realize this example is pointless, but I provide it for…
zii