Questions tagged [cross-entropy]

In machine learning and information theory, cross entropy measures the dissimilarity between two probability distributions over the same underlying set of events (it is not a true distance: it is asymmetric, and the cross entropy of a distribution with itself equals its entropy, not zero). Cross entropy is the most common choice of loss function in neural networks for classification tasks.
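For a true distribution p and a predicted distribution q over the same event set X, the standard definition is:

```latex
H(p, q) = -\sum_{x \in \mathcal{X}} p(x) \log q(x)
```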

360 questions
0 votes • 1 answer

Why does my cross-entropy loss function get huge if I use a network of many ReLUs?

I have this loss function: loss_main = tf.reduce_mean( tf.nn.softmax_cross_entropy_with_logits(train_logits, train['labels']), name='loss_main', ) train_logits is defined from a pipeline built…
Claudiu • 224,032 • 165 • 485 • 680
0 votes • 1 answer

Delta component doesn't show in weight learning rule of sigmoid activation MLP

As a basic proof of concept, consider a network that classifies K classes with input x, bias b, output y, S samples, weights v, and teacher signal t, in which t(k) equals 1 if the matching sample is under class k. Variables: let x_(is) represent the i_(th)…
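For reference, the "delta component" the title refers to, sketched under the usual assumptions (squared error, sigmoid output, learning rate η; subscripts adapted from the question's notation):

```latex
\Delta v_k = \eta \,(t_k - y_k)\, \underbrace{y_k (1 - y_k)}_{\sigma'}\, x
```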
0 votes • 1 answer

Binary-CrossEntropy - Works on Keras But Not on Lasagne?

I'm using the same convolutional neural network structure on Keras and Lasagne. Right now, I just changed to a simple network to see if it changed anything, but it didn't. On Keras it works fine; it outputs values between 0 and 1 with a good…
KenobiBastila • 539 • 4 • 16 • 52
-1 votes • 0 answers

Cross-Entropy Loss stuck at a local minimum of 2.3, accuracy stuck at 10%

I'm having problems with a multi-class classification task: a feedforward neural network that should classify roughly 60,000 EEG inputs into 10 classes. I have tried different neural network architectures, but the loss always gets stuck at 2.3 and…
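A loss flat at 2.3 on a 10-class problem is a telling number: it is -ln(1/10), the cross entropy of a model predicting the uniform distribution, which also matches the 10% accuracy. A quick check:

```python
import math

# Cross entropy of a uniform prediction over C classes is ln(C).
print(-math.log(1 / 10))  # 2.302585..., the value the loss is "stuck" at
```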
-1 votes • 1 answer

RuntimeError: expected scalar type Float but found Double from torch.nn.CrossEntropyLoss in PyTorch

I am trying to train a PyTorch model. The loss function is: cn_loss = torch.nn.CrossEntropyLoss(weight=train_label_weight, reduction='mean') Code fragment from the training function: for sents, targets in batch_iter(df_train,…
nr spider • 134 • 1 • 12
-1 votes • 1 answer

Is this formula for weighted binary cross entropy loss correct?

Here is the formula I built (my equation for weighted binary cross-entropy loss), where: alpha(i) = #Negative samples / Total samples if y(i)=1; alpha(i) = #Positive samples / Total samples if y(i)=0. Is it correct? I cannot find any similar formula…
inginging • 1 • 1 • 3
-1 votes • 1 answer

ValueError: math domain error | log() in Python

I'm new to using the math library in Python. The purpose of this script is to show my "working out" of the cross-entropy error function. I have checked my parentheses and operators and cannot see anything wrong with my noob eyes. The error occurs on…
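math.log raises ValueError: math domain error for any argument ≤ 0, so a predicted probability of exactly 0 or 1 inside the cross-entropy formula triggers it. A common guard is to clip predictions away from the boundaries (the epsilon here is an arbitrary choice):

```python
import math

def cross_entropy(y, y_hat, eps=1e-12):
    # Clip the prediction into (0, 1) so neither log() sees a zero.
    y_hat = min(max(y_hat, eps), 1 - eps)
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))

print(cross_entropy(1, 1.0))  # finite, instead of a math domain error
```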
-1 votes • 1 answer

How to implement getting the loss with PyTorch for NLP?

I'm studying NLP with just a simple toy project (just generating text) in PyTorch. While referencing some example code online, I ran into a problem I can't understand. Here is the code (some of it has been omitted and is not complete…
GE LO • 107 • 1 • 4
-1 votes • 1 answer

TensorFlow image classification binary cross-entropy loss is negative

I'm new to TensorFlow. I followed some tutorials with a provided dataset and wanted to try something on my own. I decided I'd try to classify Magic: The Gathering sets. Each card has a symbol in different colors on it: Black, Gold and so on. The…
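Binary cross entropy is non-negative whenever the targets lie in [0, 1], so a negative loss almost always means labels outside that range (e.g. raw class IDs instead of 0/1). A quick sketch with made-up numbers to see the effect:

```python
import numpy as np

def bce(y, p, eps=1e-7):
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

print(bce(np.array([0.0, 1.0]), np.array([0.2, 0.9])))  # >= 0: valid 0/1 labels
print(bce(np.array([0.0, 5.0]), np.array([0.2, 0.9])))  # negative: label outside [0, 1]
```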
-1 votes • 2 answers

What exactly is the label for an image segmentation task in computer vision?

I have been working on some image segmentation tasks lately and would like to implement one from scratch. Segmentation, as I understand it, is the per-pixel prediction of where each pixel belongs - to an object instance (things), to a background segment…
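On the cross-entropy side of this question: a segmentation label is just a per-pixel class-index map. In PyTorch, for instance, nn.CrossEntropyLoss accepts logits of shape (N, C, H, W) and an integer target of shape (N, H, W); the sizes below are arbitrary:

```python
import torch
import torch.nn as nn

N, C, H, W = 2, 4, 8, 8                        # batch, classes, height, width
logits = torch.randn(N, C, H, W)               # per-pixel class scores from the net
target = torch.randint(0, C, (N, H, W))        # label: one class index per pixel

loss = nn.CrossEntropyLoss()(logits, target)   # averaged over all N*H*W pixels
print(loss.item())
```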
-1 votes • 1 answer

Weighted cross-entropy tensorflow

I couldn't find a TensorFlow built-in that allows you to pass in labels which don't sum to 1, so I tried writing my own (input is [batch_size, labels]): tf.reduce_mean(tf.reduce_sum(y_true,axis=1) * tf.reduce_logsumexp(y_pred_logits,axis=1) -…
Akababa • 327 • 3 • 21
-1 votes • 1 answer

Information Theoretic Measure: Entropy Calculation

I have a corpus consisting of thousands of lines. For the sake of simplicity, let's consider the corpus to be: "Today is a good day", "I hope the day is good today", "It's going to rain today", "Today I have to study". How do I calculate the entropy using the…
RDM • 1,136 • 3 • 28 • 50
-1 votes • 1 answer

Tensorflow dynamic_rnn training loss decreasing, validation loss increasing

I am adding my RNN text classification model. I am using the last state to classify text. The dataset is small; I am using GloVe vectors for embedding. def rnn_inputs(FLAGS, input_data): with tf.variable_scope('rnn_inputs', reuse=True): W_input =…
Wazy • 8,822 • 10 • 53 • 98
-2 votes • 1 answer

Is this formula for cross entropy for a single example i with C classes written correctly?

If y is the label and ŷ is my prediction, would the following formula for cross entropy with C possible classes be right? In the case of binary cross entropy, can I just remove the sum over C or say C=1? For calculating the loss…
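The standard form for a single example i with one-hot label y over C classes is below; for the binary case the answer is not C = 1 but C = 2 with ŷ_{i,2} = 1 − ŷ_{i,1}, which collapses the sum to the familiar two-term expression:

```latex
L_i = -\sum_{c=1}^{C} y_{i,c} \log \hat{y}_{i,c}
\qquad\Longrightarrow\qquad
L_i = -\big[\, y_i \log \hat{y}_i + (1 - y_i) \log(1 - \hat{y}_i) \,\big] \quad (C = 2)
```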
-2 votes • 1 answer

Dimension of gradients in backpropagation

Take a simple neural network that takes in data of dimension NxF and outputs NxC, where N, F, and C represent the number of samples, features, and output neurons respectively. Needless to say, the softmax function with cross entropy is used given we…
VM_AI • 1,132 • 4 • 13 • 25