Questions tagged [cross-entropy]

In machine learning and information theory, cross-entropy measures the dissimilarity between two probability distributions over the same underlying set of events: it is the average number of bits (or nats) needed to encode events from the true distribution using a code optimized for the predicted one, and it is smallest when the two distributions coincide. Cross-entropy is the most common choice of loss function in neural networks for classification tasks.
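
For reference, a minimal NumPy sketch of the cross-entropy H(p, q) between a true distribution p and a predicted distribution q:

    import numpy as np

    def cross_entropy(p, q, eps=1e-12):
        # H(p, q) = -sum_x p(x) * log q(x); eps guards against log(0).
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return -np.sum(p * np.log(q + eps))

    # Equals the entropy of p when q == p, and grows as q diverges from p.
    print(cross_entropy([1.0, 0.0], [0.9, 0.1]))  # ~0.105
    print(cross_entropy([1.0, 0.0], [0.5, 0.5]))  # ~0.693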

360 questions
0 votes, 0 answers

I'm trying to fit the model using the fit function, but it's causing a NoneType AttributeError

Here is the section of code: #Compiling the model opt = Adam(lr = INIT_LR, decay = INIT_LR/ EPOCHS) model = model.compile(loss = "binary_crossentropy", optimizer = opt, metrics = ["accuracy"]) #training the head network H =…
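
In Keras, Model.compile configures the model in place and returns None, so the reassignment in the excerpt is the likely cause: the later model.fit(...) is then called on None. A minimal sketch of the fix (the tiny model, INIT_LR, and EPOCHS are placeholders, and the legacy decay argument is omitted since newer Keras versions removed it):

    from tensorflow.keras import layers, models
    from tensorflow.keras.optimizers import Adam

    INIT_LR, EPOCHS = 1e-4, 20  # hypothetical values; not given in the question

    inp = layers.Input(shape=(10,))                      # stand-in network
    out = layers.Dense(1, activation="sigmoid")(inp)
    model = models.Model(inp, out)

    # compile() returns None, so keep the reference: writing
    # `model = model.compile(...)` replaces `model` with None and makes the
    # later `model.fit(...)` raise the AttributeError from the title.
    opt = Adam(learning_rate=INIT_LR)
    model.compile(loss="binary_crossentropy", optimizer=opt, metrics=["accuracy"])
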
0 votes, 1 answer

Multi-class segmentation in Keras

I'm trying to implement multi-class segmentation in Keras: the input image is grayscale (i.e. 1 channel); the ground-truth image has 3 channels, where each pixel is a one-hot vector of length 3; the prediction is a standard U-Net trained with categorical_crossentropy…
sirfoga • 121 • 4
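
For a 3-class setup like this, the usual pattern is a final 1x1 convolution with 3 output channels and a softmax over the channel axis, so each pixel's prediction matches the one-hot ground truth. A hedged sketch (the input size and the trivial stand-in body are assumptions):

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Toy stand-in for the U-Net body: grayscale in, 3-channel softmax out.
    inp = layers.Input(shape=(128, 128, 1))             # assumed input size
    x = layers.Conv2D(16, 3, padding="same", activation="relu")(inp)
    out = layers.Conv2D(3, 1, activation="softmax")(x)  # per-pixel class probabilities

    model = models.Model(inp, out)
    # categorical_crossentropy compares the per-pixel softmax against
    # one-hot ground-truth masks of shape (H, W, 3).
    model.compile(optimizer="adam", loss="categorical_crossentropy")
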
0 votes, 0 answers

Torch self-implementation of a neural network

I made this neural network, but something is wrong with my backward derivatives, which causes the weights to become 'nan' and the predictions to be very low. I used Softmax and cross-entropy loss. class Neural_Network: def __init__(self, input_size,…
jenny • 105 • 5
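
NaNs in a hand-rolled softmax/cross-entropy backward pass usually come from exp() overflow or log(0); a numerically stable NumPy sketch of the standard formulation (function names are illustrative):

    import numpy as np

    def softmax(z):
        # Subtracting the row-wise max prevents exp() from overflowing to inf,
        # a common source of nan weights after a few updates.
        z = z - z.max(axis=1, keepdims=True)
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    def cross_entropy_loss_and_grad(z, y_onehot):
        # For softmax followed by cross-entropy, dL/dz = (softmax(z) - y) / N,
        # so the backward pass never differentiates log() directly.
        p = softmax(z)
        loss = -np.mean(np.sum(y_onehot * np.log(p + 1e-12), axis=1))
        grad = (p - y_onehot) / z.shape[0]
        return loss, grad
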
0 votes, 1 answer

Is there any direct relation between accuracy and loss when performing text classification using a neural network?

I am trying to perform multi-class text classification using a deep recurrent neural network. My network incurs a huge loss of 94%, 80%, and sometimes 100% at a given accuracy. It is surprising that with 64% validation accuracy the incurred…
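
Accuracy and cross-entropy loss are only loosely coupled: accuracy looks at the argmax alone, while the loss also penalizes confidence, so a model can be right often yet confidently wrong on its misses. A small NumPy illustration:

    import numpy as np

    def ce(y_true_idx, probs):
        return -np.mean(np.log(probs[np.arange(len(y_true_idx)), y_true_idx]))

    # Both models are 75% accurate on 4 two-class samples (true class = 0),
    # but the second is confidently wrong on its one mistake.
    y = np.array([0, 0, 0, 0])
    mild  = np.array([[0.6, 0.4], [0.6, 0.4], [0.6, 0.4], [0.45, 0.55]])
    harsh = np.array([[0.9, 0.1], [0.9, 0.1], [0.9, 0.1], [0.01, 0.99]])
    print(ce(y, mild))   # ~0.58
    print(ce(y, harsh))  # ~1.23 -- same accuracy, much higher loss
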
0 votes, 3 answers

How to create weighted cross entropy loss?

I have to deal with highly imbalanced data. As I understand it, I need to use a weighted cross-entropy loss. I tried this: import tensorflow as tf weights = np.array([]) def loss(y_true, y_pred): # weights.shape = (63,) # y_true.shape =…
Ivan Adanenko • 445 • 2 • 6 • 18
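
One common way to weight categorical cross-entropy per class in tf.keras is to scale each sample's loss by the weight of its true class. A sketch assuming one-hot targets and a 63-class weight vector as in the excerpt (the uniform weight values are placeholders):

    import numpy as np
    import tensorflow as tf

    class_weights = tf.constant(np.ones(63, dtype=np.float32))  # shape (63,); assumed values

    def weighted_categorical_crossentropy(y_true, y_pred):
        # Per-sample weight = weight of the true class (y_true is one-hot).
        w = tf.reduce_sum(y_true * class_weights, axis=-1)
        ce = tf.keras.losses.categorical_crossentropy(y_true, y_pred)
        return w * ce

    # model.compile(optimizer="adam", loss=weighted_categorical_crossentropy)
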
0 votes, 2 answers

How can I create a custom loss function in Keras? (Custom Weighted Binary Cross Entropy)

I'm creating a fully convolutional neural network which, given an input image, is capable of identifying zones in it (black, 0) as well as the background (white, 255). My targets are binarized images (with values 0-255), and I'd like to get some…
Durand • 67 • 6
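
A hedged sketch of a weighted binary cross-entropy for tf.keras, with separate weights for the "pixel is 0" and "pixel is 1" terms; the weight values are placeholders, and targets are assumed rescaled from 0-255 to 0-1:

    import tensorflow as tf
    from tensorflow.keras import backend as K

    def weighted_binary_crossentropy(w_zero=0.7, w_one=0.3):  # assumed weights
        def loss(y_true, y_pred):
            y_pred = K.clip(y_pred, K.epsilon(), 1.0 - K.epsilon())  # avoid log(0)
            # Weight the foreground and background terms separately.
            bce = -(w_one * y_true * K.log(y_pred)
                    + w_zero * (1.0 - y_true) * K.log(1.0 - y_pred))
            return K.mean(bce)
        return loss

    # model.compile(optimizer="adam", loss=weighted_binary_crossentropy(0.7, 0.3))
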
0 votes, 1 answer

How to find the cross entropy with TensorFlow

predicted_scores = tf.constant([ [0.32,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5], [0.31,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5,0.3,0.2,0.5], …
user6507246 • 91 • 1 • 2
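
If the row vectors are meant to be probability distributions, they should first be normalized (each row in the excerpt sums to well over 1). A sketch of computing cross-entropy between two score tensors in TensorFlow 2, with shortened stand-in rows and assumed targets:

    import tensorflow as tf

    predicted_scores = tf.constant([[0.32, 0.2, 0.5], [0.31, 0.2, 0.5]])  # shortened stand-in
    true_scores      = tf.constant([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])    # assumed targets

    # Normalize rows so each is a valid distribution, then apply
    # H(p, q) = -sum p * log q, which tf.keras exposes directly:
    q = predicted_scores / tf.reduce_sum(predicted_scores, axis=1, keepdims=True)
    ce = tf.keras.losses.categorical_crossentropy(true_scores, q)
    print(ce.numpy())
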
0 votes, 3 answers

Sparse categorical cross-entropy loss becomes NaN without label encoding

I'm building a classifier for predicting the labels -1 and 1. When I encode the labels with a one-hot encoder and use categorical cross-entropy, I don't have any problems with learning. model1.add(Dense(2,…
Hishi51 • 55 • 3 • 9
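
sparse_categorical_crossentropy expects integer class indices in the range [0, num_classes); a label of -1 indexes outside that range and typically yields NaN losses. The usual fix is to remap {-1, 1} to {0, 1} before training:

    import numpy as np

    labels = np.array([-1, 1, 1, -1])
    # Remap {-1, 1} -> {0, 1} so they are valid indices for
    # sparse_categorical_crossentropy with a 2-unit softmax output.
    labels = ((labels + 1) // 2).astype("int64")
    print(labels)  # [0 1 1 0]

    # model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
    # model.fit(X, labels, ...)
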
0 votes, 2 answers

Why does my deep neural net converge slowly with softmax in the fully connected layer, but quickly without it?

I am building a deep neural network, and I find that my network converges faster when there is no activation function (softmax) in the fully connected layer. But when I add this softmax function, convergence is really bad and even stops at a really…
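
A frequent cause is applying softmax twice: once in the last layer and once inside the loss. If the loss expects raw logits (e.g. softmax_cross_entropy_with_logits, or from_logits=True in tf.keras), feeding it softmax outputs squashes the gradients. A sketch of the two consistent setups:

    import tensorflow as tf
    from tensorflow.keras import layers, models

    # Option A: no softmax in the model; tell the loss it receives raw logits.
    model = models.Sequential([layers.Dense(10)])  # linear output
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

    # Option B: softmax in the last layer, loss consumes probabilities
    # (from_logits=False, the default). Mixing the two -- softmax outputs
    # combined with from_logits=True -- flattens gradients and slows training.
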
0 votes, 1 answer

Truly understanding cross-entropy loss

I am taking a machine learning course in which I have to implement the forward and backward methods of the CELoss: class CELoss(object): @staticmethod def forward(x, y): assert len(x.shape) == 2 # x is batch of predictions (batch_size,…
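
A hedged NumPy sketch of such a CELoss with the conventional forward/backward pair, assuming x holds raw logits of shape (batch_size, num_classes) and y holds integer class indices:

    import numpy as np

    class CELoss(object):
        @staticmethod
        def forward(x, y):
            assert len(x.shape) == 2              # x: (batch_size, num_classes) logits
            z = x - x.max(axis=1, keepdims=True)  # stabilize exp()
            log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
            return -log_p[np.arange(len(y)), y].mean()

        @staticmethod
        def backward(x, y):
            # Combined softmax + CE gradient: (softmax(x) - onehot(y)) / batch.
            z = x - x.max(axis=1, keepdims=True)
            p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
            p[np.arange(len(y)), y] -= 1.0
            return p / len(y)
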
0 votes, 0 answers

AttributeError when trying to calculate Sparse Categorical Crossentropy loss on predictions

I'm building a deep learning model in Python with Keras, with multiple inputs but only one categorical output. The data suffers from class imbalance, so I need to use integer-coded output categories instead of one-hot encoded output variables to be…
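
Without the traceback it's hard to say which attribute is missing, but a common slip is calling the loss class itself instead of an instance. The working pattern with integer-coded labels looks like this (the toy arrays are placeholders):

    import numpy as np
    import tensorflow as tf

    y_true = np.array([0, 2, 1])                       # integer-coded classes
    y_pred = np.array([[0.8, 0.1, 0.1],
                       [0.1, 0.2, 0.7],
                       [0.2, 0.6, 0.2]], dtype=np.float32)

    # Instantiate the loss object first, then call it on (y_true, y_pred).
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()
    print(loss_fn(y_true, y_pred).numpy())
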
0 votes, 1 answer

Softmax Cross-Entropy implementation in the TensorFlow GitHub source code

I am trying to implement a softmax cross-entropy loss in Python. So, I was looking at the implementation of softmax cross-entropy loss in the TensorFlow GitHub repository. I am trying to understand it, but I run into a loop of three functions and I…
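
Whatever the chain of functions in the source, the fused operation ultimately computes the numerically stable identity: for logits z and true class t, loss = logsumexp(z) - z[t] = -log(softmax(z)[t]). A NumPy sketch of that identity:

    import numpy as np

    def softmax_cross_entropy(logits, target):
        # log(sum(exp(z))) - z[target] == -log(softmax(z)[target]),
        # computed without ever materializing the softmax probabilities.
        m = logits.max()                   # shift for numerical stability
        lse = m + np.log(np.exp(logits - m).sum())
        return lse - logits[target]

    z = np.array([2.0, 1.0, 0.1])
    print(softmax_cross_entropy(z, 0))               # fused form
    print(-np.log(np.exp(z)[0] / np.exp(z).sum()))   # naive form, same value
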
0 votes, 1 answer

How to sum up and interpret epoch loss while using binary crossentropy?

For educational purposes I've been creating a deep learning library for some time now. A few days ago I received a task for an intern position: create a model from scratch, using NumPy, which will classify digits from a subset of the MNIST dataset into 2 classes…
Bearnardd • 37 • 5
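
Binary cross-entropy is conventionally averaged twice: mean over the samples in a batch, then mean of the per-batch losses over the epoch, so the epoch loss stays a per-sample quantity comparable across epochs and dataset sizes. A sketch with synthetic stand-in batches:

    import numpy as np

    def bce(y_true, y_pred, eps=1e-12):
        y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
        return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

    rng = np.random.default_rng(0)
    batches = [(rng.integers(0, 2, 32), rng.uniform(0.01, 0.99, 32))
               for _ in range(10)]             # placeholder (labels, predictions) pairs

    batch_losses = [bce(y_true, y_pred) for y_true, y_pred in batches]

    # Epoch loss = mean of per-batch means, not a running sum, so it does
    # not grow with the number of batches.
    epoch_loss = float(np.mean(batch_losses))
    print(epoch_loss)
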
0 votes, 1 answer

PyTorch cross-entropy input dimensions

I'm trying to develop a binary classifier with Hugging Face's BertModel and PyTorch. The classifier module is something like this: class SSTClassifierModel(nn.Module): def __init__(self, num_classes = 2, hidden_size = 768): …
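
nn.CrossEntropyLoss expects raw logits of shape (batch, num_classes) and integer class targets of shape (batch,) with dtype long, applying log-softmax internally. A sketch with the sizes from the excerpt (the random pooled output stands in for BERT):

    import torch
    import torch.nn as nn

    batch, hidden_size, num_classes = 8, 768, 2
    classifier = nn.Linear(hidden_size, num_classes)   # stand-in for the BERT head

    pooled = torch.randn(batch, hidden_size)           # e.g. BERT's pooled [CLS] output
    logits = classifier(pooled)                        # (batch, 2) raw logits -- no softmax

    targets = torch.randint(0, num_classes, (batch,))  # (batch,) of dtype long, values 0/1
    loss = nn.CrossEntropyLoss()(logits, targets)      # log-softmax + NLL applied internally
    print(loss.item())
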
0 votes, 1 answer

I am confused about making an mlmodel updatable using coremltools 3

I have a regressor mlmodel trained using MobileNetV2. The last several layers are as follows: I want to make this mlmodel an updatable mlmodel and train the innerProduct layer (a fully-connected layer in PyTorch). I have converted the mlmodel…
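
With coremltools 3, marking a layer updatable goes through NeuralNetworkBuilder. A rough sketch under the assumptions that the layer is named "innerProduct" and the file paths are placeholders; an updatable model also needs a loss layer and an optimizer attached, which depend on the model's output and are omitted here:

    import coremltools
    from coremltools.models.neural_network import NeuralNetworkBuilder

    # Hypothetical paths and layer name, taken from the question's description.
    spec = coremltools.utils.load_spec("Regressor.mlmodel")
    builder = NeuralNetworkBuilder(spec=spec)

    # Mark only the fully-connected layer as trainable on-device.
    builder.make_updatable(["innerProduct"])

    # A loss layer and an optimizer must also be configured before the model
    # can actually be trained on-device; the exact calls depend on the output.
    coremltools.utils.save_spec(builder.spec, "UpdatableRegressor.mlmodel")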