In machine learning and information theory, the cross entropy is a measure of the dissimilarity between two probability distributions over the same underlying set of events; unlike a true distance, it is not symmetric. Cross entropy is the most common choice of loss function in neural networks for classification tasks.
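Concretely, the cross entropy of a predicted distribution q against a true distribution p is H(p, q) = -Σₓ p(x) log q(x). A minimal sketch in plain NumPy (the example distributions are just illustrative):

import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_x p(x) * log q(x)."""
    q = np.clip(q, eps, 1.0)  # guard against log(0)
    return -np.sum(p * np.log(q))

p = np.array([0.0, 1.0, 0.0])   # one-hot target
q = np.array([0.1, 0.7, 0.2])   # softmax-style prediction
print(cross_entropy(p, q))      # ~0.357, i.e. -log(0.7)

Note that H(p, q) ≠ H(q, p) in general, which is why cross entropy is a dissimilarity measure rather than a true distance.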
Questions tagged [cross-entropy]
360 questions
2
votes
1 answer
How can I apply different weights for my loss function based on the ones coming from my train_dataloader method in PyTorch Lightning?
So basically, I am subclassing the PyTorch Lightning LightningModule. My issue is that I'm loading my data using a PyTorch DataLoader:
def train_dataloader(self):
    train_dir = f"{self.img_dir_gender}/train"
    # train_transforms: from PIL to…

Quentin Bracq
- 79
- 4
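One common pattern for this kind of question is to derive the class weights while building the dataset inside train_dataloader, stash them on self, and pass them to F.cross_entropy in training_step. A minimal self-contained sketch with synthetic data (the module, shapes, and weighting scheme are assumptions, not the asker's actual code):

import torch
import torch.nn as nn
import torch.nn.functional as F
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(8, 2)

    def forward(self, x):
        return self.net(x)

    def train_dataloader(self):
        x = torch.randn(100, 8)
        y = (torch.rand(100) < 0.2).long()  # synthetic, imbalanced labels
        # Derive per-class weights from the label counts and keep them on self
        counts = torch.bincount(y, minlength=2).float().clamp(min=1)
        self.class_weights = counts.sum() / (2 * counts)
        return DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

    def training_step(self, batch, batch_idx):
        x, y = batch
        # the weight tensor must live on the same device as the logits
        loss = F.cross_entropy(self(x), y, weight=self.class_weights.to(self.device))
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

Lightning calls train_dataloader before the first training step, so the weights are available by the time the loss is computed.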
2
votes
1 answer
Cross entropy loss with weight: manual calculation
Hi, just playing around with code, I got an unexpected result from the cross entropy loss weight…

won5830
- 512
- 5
- 13
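The "unexpected result" in questions like this is usually the normalization: with reduction="mean", PyTorch divides the weighted per-sample losses by the sum of the weights of the targets in the batch, not by the batch size. A sketch reproducing the built-in value by hand (the weights and shapes are arbitrary):

import torch
import torch.nn.functional as F

logits = torch.randn(4, 3)
targets = torch.tensor([0, 1, 2, 1])
w = torch.tensor([1.0, 2.0, 3.0])

builtin = F.cross_entropy(logits, targets, weight=w)  # reduction="mean"

log_probs = F.log_softmax(logits, dim=1)
per_sample = -w[targets] * log_probs[torch.arange(4), targets]
manual = per_sample.sum() / w[targets].sum()  # NOT per_sample.mean()

print(torch.allclose(builtin, manual))  # True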
2
votes
1 answer
Pytorch - RuntimeError: Expected object of scalar type Long but got scalar type Float for argument #2 'target' in call to _thnn_nll_loss_forward
I was trying and experimenting with PyTorch, where I created my own inputs and targets. I fed these inputs to the model (a basic ANN with 2 hidden layers, nothing wrong with that). But for some reason I am not able to calculate…

imharjyotbagga
- 199
- 4
- 17
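This error says that the class-index target passed to nn.CrossEntropyLoss (which calls nll_loss internally) must have dtype torch.long, while hand-built tensors default to float. A minimal sketch of the fix (shapes are illustrative):

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
logits = torch.randn(5, 3)                              # (batch, num_classes); float is fine here
bad_targets = torch.tensor([0.0, 2.0, 1.0, 1.0, 0.0])   # float targets trigger the RuntimeError

targets = bad_targets.long()                            # class indices must be torch.long
loss = criterion(logits, targets)
print(loss)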
2
votes
0 answers
PyTorch ValueError: Expected target size (2, 33), got torch.Size([2, 73])
I'm quite new to PyTorch.
I want to compute the loss of a batch in a Transformer. In this case my 'batch' has only two replicas.
The model outputs a tensor of shape (2, 73, 33):
output
tensor([[[ 21.1355, -7.5047, 2.8138, ..., -14.1462,…

katze
- 43
- 7
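The mismatch comes from nn.CrossEntropyLoss treating dimension 1 as the class dimension: given input (2, 73, 33) it reads C=73 and therefore expects a (2, 33) target. Assuming 33 is the vocabulary/class axis, either permute it into position 1 or flatten the batch and sequence axes. A sketch with the question's shapes:

import torch
import torch.nn.functional as F

output = torch.randn(2, 73, 33)            # (batch, seq_len, num_classes)
target = torch.randint(0, 33, (2, 73))     # (batch, seq_len) class indices

# Move the class axis to dim 1, where cross_entropy expects it
loss = F.cross_entropy(output.permute(0, 2, 1), target)

# Or equivalently, flatten batch and sequence dimensions
loss_flat = F.cross_entropy(output.reshape(-1, 33), target.reshape(-1))
print(torch.allclose(loss, loss_flat))     # True: both average over all tokens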
2
votes
1 answer
Pytorch lightning metrics: ValueError: preds and target must have same number of dimensions, or one additional dimension for preds
Googling this gets you nowhere, so I decided to help future me and others by posting this as a searchable question.
def __init__(self):
    ...
    self.val_acc = pl.metrics.Accuracy()

def validation_step(self, batch, batch_index):
    ...
    …

Gulzar
- 23,452
- 27
- 113
- 201
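The metric wants either preds and target with identical shapes, or preds with exactly one extra (class) dimension, so (N, C) logits against (N,) integer targets work while one-hot targets do not. A sketch using torchmetrics, where pl.metrics.Accuracy moved in later releases (the task/num_classes arguments reflect torchmetrics >= 0.11 and are an assumption about the installed version):

import torch
from torchmetrics import Accuracy

acc = Accuracy(task="multiclass", num_classes=3)

logits = torch.randn(8, 3)            # (N, C): one extra dim for preds is allowed
target = torch.randint(0, 3, (8,))    # (N,) integer class indices
print(acc(logits, target))

# One-hot targets of shape (N, C) raise the ValueError; convert them first
one_hot = torch.nn.functional.one_hot(target, num_classes=3)
print(acc(logits, one_hot.argmax(dim=1)))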
2
votes
1 answer
Using categorical_crossentropy for a sequence of images
I have a model that accepts a sequence of images as input (None, n_step, 128, 128) (instead of a single image), where n_step is fixed at 10. I am using categorical_crossentropy for a four-class classification problem. But I have an…

iamkk
- 135
- 1
- 16
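One way to wire this up is to run the CNN per frame with TimeDistributed, summarize the sequence, and end in a 4-way softmax; categorical_crossentropy then expects one-hot labels of shape (batch, 4). A sketch with synthetic data (a single channel axis is assumed, since the question's (None, n_step, 128, 128) input has none):

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

n_step = 10
model = tf.keras.Sequential([
    layers.Input(shape=(n_step, 128, 128, 1)),
    layers.TimeDistributed(layers.Conv2D(8, 3, activation="relu")),
    layers.TimeDistributed(layers.GlobalAveragePooling2D()),
    layers.LSTM(16),                        # summarize the frame sequence
    layers.Dense(4, activation="softmax"),  # four classes
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

x = np.random.rand(6, n_step, 128, 128, 1).astype("float32")
y = tf.keras.utils.to_categorical(np.random.randint(0, 4, size=6), num_classes=4)
model.fit(x, y, epochs=1, verbose=0)

If the labels are kept as integers instead of one-hot vectors, sparse_categorical_crossentropy is the drop-in alternative.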
2
votes
2 answers
Using class_weight for imbalanced data with .fit_generator()
I have an imbalanced dataset with 2 classes. I am using categorical_crossentropy. I wonder about my code: is it correct to use class_weight with categorical_crossentropy? If yes, does the class_weight apply only to the training set or to the whole…

Edayildiz
- 545
- 7
- 16
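As to the question: yes, class_weight works with categorical_crossentropy, and it only rescales the training loss per sample by that sample's class; validation data and predictions are unaffected. A minimal sketch with synthetic data (fit_generator is deprecated in favor of fit, which also accepts generators):

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

x = np.random.rand(100, 20).astype("float32")
y_int = (np.random.rand(100) < 0.1).astype(int)           # ~10% minority class
y = tf.keras.utils.to_categorical(y_int, num_classes=2)   # one-hot labels

# class_weight only affects the TRAINING loss, not validation or inference
model.fit(x, y, epochs=1, class_weight={0: 1.0, 1: 9.0}, verbose=0)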
2
votes
1 answer
How to write a custom loss function in LGBM?
I have a binary cross-entropy implementation in Keras. I would like to implement the same one in LGBM as a custom loss. Now, I understand LGBM of course has a 'binary' objective built-in, but I would like to implement this one custom-made on my own as a…

Milind Dalvi
- 826
- 2
- 11
- 20
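A custom LightGBM objective returns the gradient and Hessian of the loss with respect to the raw scores; for binary cross entropy those are sigmoid(score) - y and sigmoid(score) * (1 - sigmoid(score)). A sketch via the scikit-learn wrapper, which accepts a callable objective (exact support varies across LightGBM versions, so treat this as an assumption to verify):

import numpy as np
import lightgbm as lgb

def binary_logloss(y_true, raw_score):
    """Gradient and Hessian of binary cross entropy w.r.t. raw scores."""
    p = 1.0 / (1.0 + np.exp(-raw_score))  # sigmoid
    return p - y_true, p * (1.0 - p)

x = np.random.rand(200, 5)
y = (np.random.rand(200) < 0.5).astype(int)

clf = lgb.LGBMClassifier(objective=binary_logloss, n_estimators=20)
clf.fit(x, y)

# With a custom objective the booster emits raw scores, so apply the
# sigmoid yourself to recover probabilities
probs = 1.0 / (1.0 + np.exp(-clf.predict(x, raw_score=True)))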
2
votes
1 answer
Using sigmoid output for cross entropy loss on Pytorch
I’m trying to modify Yolo v1 to work with my task, in which each object has only 1 class. (e.g. an obj cannot be both cat and dog)
Due to the architecture (other outputs, like the localization prediction, must use regression), sigmoid was applied to the…

StopLimitMePlease
- 21
- 2
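If each object has exactly one class, the mutually exclusive class scores belong under a softmax, and PyTorch's nn.CrossEntropyLoss applies log_softmax itself, so it must receive raw logits; feeding it sigmoid outputs silently distorts the loss. A sketch of the difference (shapes are illustrative, using 20 classes as in Yolo v1's Pascal VOC setting):

import torch
import torch.nn as nn

logits = torch.randn(4, 20)            # raw class scores from the network
targets = torch.randint(0, 20, (4,))
criterion = nn.CrossEntropyLoss()

loss = criterion(logits, targets)                  # correct: raw logits in
wrong = criterion(torch.sigmoid(logits), targets)  # squashed inputs distort the loss
print(loss.item(), wrong.item())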
2
votes
0 answers
Query about Loss Functions for LSTM models (Binary Classification)
I'm working on building an LSTM model to binary-classify price movements.
My training data is simulated: a 2,000-row by 3,780-column dataframe of price movements.
I have a separate labels file that classifies price movements as either 1…

Ian Murray
- 87
- 8
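For a single-probability output, the usual pairing is a one-unit sigmoid head with binary_crossentropy. A sketch with synthetic data shaped roughly like the question describes, assuming each row is one sequence of 3,780 one-feature timesteps (that reshape is a guess about the asker's data):

import numpy as np
import tensorflow as tf

x = np.random.rand(128, 3780, 1).astype("float32")        # (samples, timesteps, features)
y = np.random.randint(0, 2, size=(128, 1)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3780, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),       # single probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=1, batch_size=32, verbose=0)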
2
votes
2 answers
Meaning of sparse in "sparse cross entropy loss"?
I read from the documentation:
tf.keras.losses.SparseCategoricalCrossentropy(
    from_logits=False, reduction="auto", name="sparse_categorical_crossentropy"
)
Computes the crossentropy loss between the labels and predictions.
Use this…

Josh
- 11,979
- 17
- 60
- 96
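"Sparse" here refers to the label format, not to the loss itself: the sparse variant takes integer class indices while the plain variant takes one-hot vectors, and both compute the same cross entropy. A quick check:

import tensorflow as tf

y_int = tf.constant([1, 2])                  # integer ("sparse") labels
y_onehot = tf.one_hot(y_int, depth=3)        # the same labels, one-hot encoded
preds = tf.constant([[0.1, 0.8, 0.1],
                     [0.2, 0.2, 0.6]])

sparse = tf.keras.losses.SparseCategoricalCrossentropy()
dense = tf.keras.losses.CategoricalCrossentropy()
print(sparse(y_int, preds).numpy(), dense(y_onehot, preds).numpy())  # equal values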
2
votes
1 answer
Cross Entropy Calculation in PyTorch tutorial
I'm reading the PyTorch tutorial on a multi-class classification problem, and I find the behavior of the loss calculation in PyTorch confusing. Can you help me with this?
The model used for classification goes like this:
class Net(nn.Module):
…

Nick Nick Nick
- 159
- 3
- 13
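The usual source of confusion in that tutorial is that nn.CrossEntropyLoss is LogSoftmax followed by NLLLoss, which is why the Net returns raw logits with no softmax layer. A sketch showing the equivalence (shapes mirror a 10-class classifier, not the tutorial's exact model):

import torch
import torch.nn as nn
import torch.nn.functional as F

logits = torch.randn(4, 10)             # raw network outputs, no softmax applied
targets = torch.randint(0, 10, (4,))

ce = nn.CrossEntropyLoss()(logits, targets)
nll = F.nll_loss(F.log_softmax(logits, dim=1), targets)
print(torch.allclose(ce, nll))          # True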
2
votes
0 answers
Acceptable range of Categorical Cross-Entropy loss function
I am making an LSTM network where the output is in the form of one-hot encoded directions Left, Right, Up and Down, which comes out like:
[0. 0. 1. 0.]
[1. 0. 0. 0.]
[0. 0. 1. 0.]
...
[0. 0. 1. 0.]
[0. 0. 1. 0.]
[0. 0. 0. 0.]
What should be…

Mohak Shukla
- 77
- 1
- 7
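A useful reference point: a model that predicts the uniform distribution scores ln(num_classes) per sample, so for four classes anything around 1.386 means "no better than guessing" and values near 0 mean confident, correct predictions. (Note also that an all-zeros row like [0. 0. 0. 0.] in the question's output is not a valid one-hot label.) A two-line check:

import numpy as np

# Uniform guessing over 4 classes costs -log(1/4) = log(4) per sample
print(np.log(4))   # ~1.386; a freshly initialized model should start near this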
2
votes
3 answers
AttributeError: module 'tensorflow_core.python.keras.api._v2.keras.losses' has no attribute 'softmax_cross_entropy'
I get an AttributeError: module 'tensorflow_core.python.keras.api._v2.keras.losses' has no attribute 'softmax_cross_entropy' when using tf.losses.softmax_cross_entropy. Could someone help me?
user12682643
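tf.losses.softmax_cross_entropy is TF1 API; in TF2 it survives only as tf.compat.v1.losses.softmax_cross_entropy, and the idiomatic replacements are the Keras loss class or the low-level op. A sketch of both options:

import tensorflow as tf

labels = tf.one_hot([0, 2], depth=3)
logits = tf.constant([[2.0, 0.5, 0.1],
                      [0.3, 0.2, 1.5]])

# Keras-style replacement
loss_fn = tf.keras.losses.CategoricalCrossentropy(from_logits=True)
print(loss_fn(labels, logits).numpy())

# Low-level replacement
print(tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)).numpy())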
2
votes
1 answer
Cross-entropy loss influence over F-score
I'm training an FCN (Fully Convolutional Network) and using "Sigmoid Cross Entropy" as the loss function.
My measurements are F-measure and MAE.
The train/dev loss vs. iteration graph looks like the one below:
Although the dev loss has a slight…

hashuy
- 23
- 5
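Part of why the two curves can disagree: cross entropy scores the predicted probabilities themselves, while F-measure only sees the thresholded decisions, so the loss can keep improving while F-measure stays flat, or vice versa. A toy illustration with made-up predictions:

import numpy as np
from sklearn.metrics import f1_score, log_loss

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
p = np.array([0.55, 0.45, 0.62, 0.58, 0.40, 0.48, 0.51, 0.52])

print(log_loss(y_true, p))          # sensitive to every probability
print(f1_score(y_true, p > 0.5))    # only sees which side of 0.5 each falls on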