Questions tagged [cross-entropy]

In machine learning and information theory, the cross entropy is a measure of the difference between two probability distributions over the same underlying set of events. It is not a true distance: it is asymmetric and does not reach zero even when the two distributions coincide (its minimum is the entropy of the true distribution). Cross entropy is a common choice of loss function in neural networks for classification tasks.
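For a true distribution p and a model distribution q over the same events, the cross entropy is H(p, q) = -∑ₓ p(x) log q(x) = H(p) + D(p ‖ q), so minimizing it with respect to q is equivalent to minimizing the KL divergence D(p ‖ q).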

360 questions
0 votes • 1 answer

Weights to Crossentropy loss

Let's say I have two classes, class A (100 samples) and class B (100 samples), in training, but at test time class A has 1000 samples and class B has 100 samples. How am I supposed to calculate and use weights for a weighted CrossEntropy loss?…
asked by amy (342)
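One possible sketch in PyTorch, using the counts from the question; reweighting each class by the ratio of its test prior to its train prior is an assumption on my part, not something the question settles:

```python
import torch
import torch.nn as nn

# Counts from the question: balanced in training, imbalanced at test time.
train_counts = torch.tensor([100.0, 100.0])    # class A, class B
test_counts = torch.tensor([1000.0, 100.0])

train_prior = train_counts / train_counts.sum()
test_prior = test_counts / test_counts.sum()

# One option: upweight each class by how under-represented it is in
# training relative to the expected test distribution.
weights = test_prior / train_prior

criterion = nn.CrossEntropyLoss(weight=weights)
logits = torch.randn(8, 2)             # [batch, num_classes]
targets = torch.randint(0, 2, (8,))    # [batch]
loss = criterion(logits, targets)
```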
0 votes • 1 answer

tf.losses.log_loss and tf.nn.softmax in Tensorflow and Pytorch

I am trying to implement a network that has the following loss function definition in PyTorch: logits = F.log_softmax(layer_output); loss = F.nll_loss(logits, labels). This link…
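For reference, the two-step PyTorch pattern in the question is equivalent to a single F.cross_entropy call (TensorFlow's tf.nn.sparse_softmax_cross_entropy_with_logits plays the analogous fused role); a minimal check, with dim made explicit to avoid the implicit-dimension deprecation warning:

```python
import torch
import torch.nn.functional as F

layer_output = torch.randn(4, 10)     # raw scores, [batch, classes]
labels = torch.randint(0, 10, (4,))

# The question's two-step loss:
log_probs = F.log_softmax(layer_output, dim=1)
loss_a = F.nll_loss(log_probs, labels)

# Equivalent single call:
loss_b = F.cross_entropy(layer_output, labels)

assert torch.allclose(loss_a, loss_b)
```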
0 votes • 1 answer

Cross Entropy Loss for One Hot Encoding

CE loss sums the loss over all output nodes: Sum_i[ -target_i * log(output_i) ]. Its derivative with respect to output_i is -target_i / output_i. Since for target_i = 0 both the loss term and its derivative are zero regardless of the actual output, it seems like…
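A small numeric check of the claim, with one caveat worth adding: the zero derivative holds with respect to the outputs directly, but when the outputs come from a softmax the chain rule still yields the well-known gradient output − target on every logit:

```python
import numpy as np

output = np.array([0.7, 0.2, 0.1])   # softmax outputs
target = np.array([1.0, 0.0, 0.0])   # one-hot target

# Full sum vs. the single surviving term: identical for one-hot targets.
loss_full = -np.sum(target * np.log(output))
loss_true_class = -np.log(output[0])
grad = -target / output               # zero wherever target is zero

print(loss_full, loss_true_class, grad)
```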
0 votes • 1 answer

How can I get the right balance between classification loss and a regularizer?

I'm working on a deep learning classifier (Keras and Python) that classifies time series into three categories. The loss function that I'm using is the standard categorical cross-entropy. In addition to this, I also have an attention map which is…
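A common pattern is a single scalar that trades the two terms off, tuned on a validation set; a minimal Keras-style sketch, where lambda_reg and the attention penalty are placeholders, not the question's actual setup:

```python
import tensorflow as tf

lambda_reg = 0.1   # hypothetical; usually grid-searched on validation data

def total_loss(y_true, y_pred, attention_map):
    # Standard classification term from the question...
    ce = tf.reduce_mean(
        tf.keras.losses.categorical_crossentropy(y_true, y_pred))
    # ...plus a placeholder attention penalty. A reasonable starting
    # lambda makes the two terms comparable in magnitude early in
    # training, so neither one dominates.
    reg = tf.reduce_mean(tf.abs(attention_map))
    return ce + lambda_reg * reg
```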
0 votes • 1 answer

How is cross entropy calculated for pixel level prediction

I'm running an FCN in Keras that uses binary cross-entropy as the loss function. However, I'm not sure how the losses are accumulated. I know that the loss is applied at the pixel level, but then are the losses for each pixel in the image summed…
asked by Jonathan (1,876)
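A quick check of what Keras actually does, assuming the default reduction: binary_crossentropy produces one value per pixel, and the scalar training loss is the mean (not the sum) over pixels and batch:

```python
import numpy as np
import tensorflow as tf

y_true = np.random.randint(0, 2, size=(2, 4, 4, 1)).astype("float32")
y_pred = np.random.uniform(0.01, 0.99, size=(2, 4, 4, 1)).astype("float32")

# Keras reduces over the channel axis, leaving one value per pixel...
per_pixel = tf.keras.losses.binary_crossentropy(y_true, y_pred)
print(per_pixel.shape)        # (2, 4, 4): one loss per pixel

# ...and the scalar loss is the mean of those values.
manual = -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
print(float(tf.reduce_mean(per_pixel)), manual)
```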
0 votes • 0 answers

pytorch subtracting variables with different dimension

conf_loss = cross_entropy_loss(conf_preds.view(-1, num_classes), conf_targets.view(-1)). The shapes of x and y are torch.Size([69856, 40]) and torch.Size([69856]) respectively. The author gives the sizes as x: [N, D] and y: [N,], but my y size is…
asked by ninjakx (35)
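For comparison, the shapes PyTorch's cross-entropy expects, with hypothetical tensors matching the sizes in the question; a mismatch here usually means y holds one-hot rows or an extra dimension that needs .view(-1):

```python
import torch
import torch.nn.functional as F

num_classes = 40
conf_preds = torch.randn(69856, num_classes)            # x: [N, D]
conf_targets = torch.randint(0, num_classes, (69856,))  # y: [N]

# cross_entropy wants raw scores [N, C] and integer class indices [N].
loss = F.cross_entropy(conf_preds, conf_targets)
```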
0 votes • 1 answer

Increasing Cost Value at end of each epoch

I'm relatively new to TensorFlow, and I was trying to play around with the MNIST dataset. This is the code I have, but for some reason the epoch-cost increases with each iteration. I tried changing the learning rate, number of layers, and neurons…
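One frequent cause of a rising cost in TF1-style MNIST code (an assumption here, since the question's code is not shown in full) is feeding already-softmaxed outputs to a softmax cross-entropy op, which silently applies softmax twice; a minimal sketch of the correct wiring:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

x = tf.placeholder(tf.float32, [None, 784])
labels = tf.placeholder(tf.float32, [None, 10])

W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
logits = tf.matmul(x, W) + b          # raw logits: no softmax here

# The loss op applies softmax itself; pass logits, never softmax(logits).
cost = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cost)
```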
0 votes • 2 answers

Simple softmax classifier in tensorflow

So I am trying to write a simple softmax classifier in TensorFlow. Here is the code: # Neural network parameters n_hidden_units = 500 n_classes = 10 # training set placeholders input_X = tf.placeholder(dtype='float32',shape=(None,X_train.shape[1],…
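A compact modern equivalent of the classifier being sketched (500 hidden units, 10 classes), assuming the Keras API rather than the question's raw-placeholder style; from_logits=True lets the loss apply softmax internally and numerically stably:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(784,)),    # flattened MNIST-style input
    tf.keras.layers.Dense(500, activation="relu"),
    tf.keras.layers.Dense(10),              # raw logits, no softmax layer
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
```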
0 votes • 1 answer

Periodical pattern in loss function in a convolutional neural network (tensorflow)

I'm working on image segmentation using a convolutional neural network (CNN) implemented in TensorFlow. I have two classes, and I am using cross entropy as the loss function and Adam as the optimizer. I am training the network with around 150 images. During…
0 votes • 1 answer

Tensorflow - softmax returning only 0 and 1

I'm training a CNN in TensorFlow, but my loss is not improving; I've noticed that tf.nn.softmax() is returning a tensor with only 0s and 1s, not a distribution as I'd expect. Here's the repo; I believe that's the reason…
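Worth noting that this is usually saturation rather than a bug: once logits differ by more than a few tens, softmax is numerically indistinguishable from a one-hot vector, which typically points at unnormalized inputs or exploding weights. A quick demonstration:

```python
import numpy as np

def softmax(z):
    z = z - z.max()                   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

print(softmax(np.array([1.0, 2.0, 3.0])))       # a genuine distribution
print(softmax(np.array([10.0, 200.0, 300.0])))  # ~[0, 0, 1]: saturated
```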
0 votes • 0 answers

What is the reason for new function tf.nn.softmax_cross_entropy_with_logits_v2?

TensorFlow has a wonderful function, tf.nn.softmax_cross_entropy_with_logits. Later I saw another function, tf.nn.softmax_cross_entropy_with_logits_v2. What is the reason for this new function? While using the earlier function TensorFlow…
asked by Maruf (792)
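The documented difference is gradient flow into the labels: the original op treats labels as constants, while the _v2 variant also backpropagates into them (useful when the targets are themselves produced by a network, e.g. soft labels). A sketch of recovering the old behavior under _v2:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

logits = tf.placeholder(tf.float32, [None, 10])
labels = tf.placeholder(tf.float32, [None, 10])

# Wrapping the labels in stop_gradient makes the _v2 op behave like
# the original: no gradient flows into the labels.
loss = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=tf.stop_gradient(labels), logits=logits)
```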
0 votes • 1 answer

How to set parameter weights in tf.losses.sigmoid_cross_entropy?

I'm now trying to use tf.losses.sigmoid_cross_entropy on an unbalanced dataset. However, I'm a little confused about the weights parameter. Here are the comments from the documentation: weights: Optional Tensor whose rank is either 0, or the same rank…
asked by Ben (11)
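A sketch of the per-example interpretation, with a hypothetical 10x upweighting of the positive class; a rank-0 (scalar) weight would instead rescale the entire loss uniformly:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

labels = tf.placeholder(tf.float32, [None, 1])
logits = tf.placeholder(tf.float32, [None, 1])

# Same-rank weights: one weight per element, here boosting positives.
weights = tf.where(tf.equal(labels, 1.0),
                   10.0 * tf.ones_like(labels),
                   tf.ones_like(labels))
loss = tf.losses.sigmoid_cross_entropy(
    multi_class_labels=labels, logits=logits, weights=weights)
```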
0 votes • 1 answer

how to scale and renormalize the output with tensorflow softmax_cross_entropy_with_logits for class imbalance

I want to scale the model output and renormalize it to deal with the class imbalance issue. For example, if I have a 10-label output y_logits, its softmax y_pred, and a prior p, the new output should be: y_pred /= prior; y_pred /= sum(y_pred). The…
asked by Ehab AlBadawy (3,065)
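The two steps in the question have a closed form that stays compatible with softmax_cross_entropy_with_logits: dividing the softmax by the prior and renormalizing is the same as shifting the logits by -log(prior). A quick check (the prior here is a random placeholder):

```python
import numpy as np

logits = np.random.randn(10)               # y_logits from the question
prior = np.random.dirichlet(np.ones(10))   # hypothetical class priors

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# The question's two steps...
y_pred = softmax(logits)
adjusted = y_pred / prior
adjusted /= adjusted.sum()

# ...equal a plain logit shift, so the adjusted logits can be fed
# straight into softmax_cross_entropy_with_logits:
assert np.allclose(adjusted, softmax(logits - np.log(prior)))
```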
0 votes • 1 answer

Tensorflow weighted vs sigmoid cross-entropy loss

I am trying to implement multi-label classification using TensorFlow (i.e., each output pattern can have many active units). The problem has imbalanced classes (i.e., many more zeros than ones in the label distribution), which makes label patterns…
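For this zeros-dominated multi-label setting, TensorFlow's dedicated op takes a pos_weight that upweights the positive term; a minimal sketch, with the 10x value a stand-in for the actual negative/positive ratio in the labels:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

targets = tf.placeholder(tf.float32, [None, 20])   # multi-label targets
logits = tf.placeholder(tf.float32, [None, 20])

# pos_weight > 1 penalizes missed positives more heavily, boosting
# recall on the rare positive labels at the cost of some precision.
loss = tf.reduce_mean(
    tf.nn.weighted_cross_entropy_with_logits(
        targets=targets, logits=logits, pos_weight=10.0))
```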
0 votes • 1 answer

Tensorflow logits and labels error, but are same shape

This question has been asked several times already, but I don't seem to be able to adapt previous solutions to my code, so I would appreciate any advice on how to solve this. I have tried using pdb and setting a trace point right before the…
asked by Lena (19)
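Shape errors like this often come down to dense versus sparse label formats rather than the logits themselves (an assumption about the cause, since the question's code isn't shown); a sketch of the two valid pairings:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

logits = tf.placeholder(tf.float32, [None, 10])

# Dense (one-hot) labels, shape [batch, 10], go with the dense op:
onehot = tf.placeholder(tf.float32, [None, 10])
loss_dense = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=onehot, logits=logits)

# Integer class indices, shape [batch], go with the *sparse* op.
# Crossing the two pairings raises exactly this kind of logits/labels
# shape error even when both tensors look right individually.
indices = tf.placeholder(tf.int64, [None])
loss_sparse = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=indices, logits=logits)
```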