Questions tagged [loss]

599 questions
0
votes
1 answer

softmax cross entropy return value

What does it mean if this is the return value for tf.losses.softmax_cross_entropy_loss? Does the fact that it states value:0 and shape=() mean that nothing was computed?
haxtar
  • 1,962
  • 3
  • 20
  • 44
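In TensorFlow's graph mode, a loss op like this returns a symbolic scalar tensor: shape=() means "a single scalar" and value:0 is just the tensor's name in the graph, so nothing has been computed yet — the number only appears when the tensor is evaluated (e.g. via session.run). As a rough plain-Python sketch of the scalar that evaluation produces (the function name and inputs here are illustrative, not the TF implementation):

```python
import math

def softmax_cross_entropy(logits, onehot_labels):
    """Plain-Python sketch of scalar softmax cross-entropy for one
    example; names and inputs are illustrative, not the TF API."""
    m = max(logits)                          # stabilise the exponentials
    exps = [math.exp(z - m) for z in logits]
    probs = [e / sum(exps) for e in exps]    # softmax
    return -sum(y * math.log(p) for y, p in zip(onehot_labels, probs))

loss = softmax_cross_entropy([2.0, 1.0, 0.1], [1, 0, 0])  # a single scalar
```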
0
votes
1 answer

Loss function representing the Euclidean distance from prediction to nearest groundtruth in images?

Is there a loss function that calculates the Euclidean distance between a prediction pixel and the nearest groundtruth pixel? Specifically, this is the location distance, not the intensity distance. This would be on binary predictions and binary…
user135237
  • 389
  • 1
  • 3
  • 12
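There is no standard off-the-shelf loss for this, but the quantity itself is easy to state. A hedged brute-force sketch (names and coordinates are illustrative; in practice one would precompute a distance transform of the groundtruth mask, e.g. with scipy.ndimage.distance_transform_edt, and note that the nearest-pixel minimum is not smoothly differentiable):

```python
import math

def mean_nearest_gt_distance(pred_pixels, gt_pixels):
    """Average Euclidean (location) distance from each predicted
    foreground pixel to its nearest groundtruth foreground pixel.
    Brute force: O(len(pred) * len(gt))."""
    total = 0.0
    for pr, pc in pred_pixels:
        total += min(math.hypot(pr - gr, pc - gc) for gr, gc in gt_pixels)
    return total / len(pred_pixels)

# two predicted pixels, one groundtruth pixel at the origin
d = mean_nearest_gt_distance([(0, 0), (3, 4)], [(0, 0)])
# d == (0 + 5) / 2 == 2.5
```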
0
votes
0 answers

the loss doesn't change for the network with a linear layer after softmax

# Inputs and Placeholders x = tf.placeholder(tf.float32, shape=[None, 30]) y_ = tf.placeholder(tf.float32) # Inference W_1 = tf.Variable(tf.zeros([30,50])) b_1 = tf.Variable(tf.zeros([50])) layer_1 = tf.sigmoid(tf.add(tf.matmul(x, W_1), b_1)) W_2…
Luca
  • 59
  • 4
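One plausible culprit in the excerpt above is the all-zero weight initialisation: with W and b at zero, every sigmoid unit outputs the same value (0.5) and receives the same gradient, so the units never differentiate and the loss can stall. A minimal pure-Python illustration (shapes shrunk for brevity; names illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Zero-initialised layer: every hidden unit computes the same value,
# so every unit receives the same gradient and they never diverge.
x = [0.5, -1.2, 3.0]                 # one input row (3 features)
W = [[0.0] * 4 for _ in x]           # 3x4 weight matrix, all zeros
b = [0.0] * 4                        # 4 biases, all zeros
hidden = [sigmoid(sum(xi * W[i][j] for i, xi in enumerate(x)) + b[j])
          for j in range(4)]
# every entry of hidden is sigmoid(0) == 0.5
```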
0
votes
1 answer

Modifying Torch criterion

I want to create a custom loss function in Torch which is a modification of ClassNLLCriterion. Concretely, ClassNLLCriterion loss is: loss(x, class) = -x[class] I want to modify this to be: loss(x, class) = -x[class]*K where K is a function of the…
braindead
  • 97
  • 2
  • 7
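The arithmetic of the modification is simple; here is a hedged Python sketch of the two per-example loss values (in Torch itself this would be done by subclassing nn.Criterion and implementing updateOutput/updateGradInput so the backward pass is scaled by the same K; K is just a supplied number here, since the question's definition of K is truncated):

```python
def class_nll(log_probs, target):
    """Standard ClassNLLCriterion for one example: loss = -x[class]."""
    return -log_probs[target]

def scaled_class_nll(log_probs, target, k):
    """Modified criterion: loss = -x[class] * K. The gradient w.r.t.
    x[class] is then -K instead of -1, so a custom backward pass must
    apply the same scaling."""
    return -log_probs[target] * k
```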
0
votes
0 answers

custom loss function different from the default

I am trying to understand how to build a custom loss function and the first thing I've tried is to reimplement the binary_crossentropy function in keras. In my code if I do: model.compile(Adam(lr=learning_rate), loss=losses.binary_crossentropy,…
Angel Lopez
  • 29
  • 1
  • 4
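For comparison while debugging a custom version, this is the math that binary_crossentropy computes, in plain Python. Note the clipping: Keras backends clip probabilities with a small epsilon so log(0) never occurs, which is a common source of mismatch between a naive reimplementation and the built-in loss (the constant below is an assumption, not pulled from any particular Keras version):

```python
import math

EPSILON = 1e-7  # assumed clip value; Keras backends use a similar epsilon

def binary_crossentropy(y_true, y_pred):
    """Mean binary cross-entropy over a batch, in plain Python.
    A custom Keras loss would express the same math with backend ops."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, EPSILON), 1.0 - EPSILON)  # avoid log(0)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)
```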
0
votes
1 answer

Modifying ClassNLLCriterion in Torch

I am new to Torch and I want to create a custom loss function in Torch which is a modification of ClassNLLCriterion. Concretely, ClassNLLCriterion loss is: loss(x, class) = -x[class] I want to modify this to be: loss(x, class) = -x[class] +…
braindead
  • 97
  • 2
  • 7
0
votes
1 answer

Recovering a checkpoint after reaching NaN loss?

I'm training an RNN, and at some point overnight the loss reached NaN. I've been reading that a solution to this is to decrease the learning rate. When attempting to restart training from the (only) checkpoint I have and using a smaller learning…
hate5six
  • 11
  • 3
0
votes
1 answer

why is my keras custom metric not working?

Why does this code work fine for the loss function but the metrics fail after one iteration with "ValueError: operands could not be broadcast together with shapes (32,) (24,) (32,)"? If I use "categorical_crossentropy" in quotes then it works. And…
simon
  • 2,561
  • 16
  • 26
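The shapes in the error are a strong hint: (32,) is the full batch size and (24,) is the final partial batch of an epoch, so a custom metric that holds state sized for 32 samples breaks on the last batch. A hedged pure-Python mimic of the strict length check behind numpy's broadcast error (names illustrative):

```python
def elementwise_equal(a, b):
    """Mimics numpy's refusal to broadcast mismatched 1-D shapes,
    which is what a batch-size-sensitive custom metric trips over
    on the final, smaller batch."""
    if len(a) != len(b):
        raise ValueError(
            "operands could not be broadcast together with shapes "
            "({},) ({},)".format(len(a), len(b)))
    return [x == y for x, y in zip(a, b)]

cached_state = [0.0] * 32      # metric state sized for a full batch
full_batch = [0.0] * 32
partial_batch = [0.0] * 24     # last batch of the epoch

ok = elementwise_equal(cached_state, full_batch)       # fine
# elementwise_equal(cached_state, partial_batch) raises ValueError
```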
0
votes
1 answer

Multiple executions of the evaluation give different losses in TensorFlow

I'm getting started with TensorFlow. https://www.tensorflow.org/get_started/ While evaluating multiple times to see how to feed the data, I found that the loss changes between executions. eval_input_fn =…
0
votes
1 answer

Torch, how to get a tensor of loss values during batch optimization

I am training a network with batch optimization over my training set, and I would like to get a loss vector containing the loss of each of my training examples. More specifically I am using images (of size 3x64x64) in a batch of size 64. Therefore…
fonfonx
  • 1,475
  • 21
  • 30
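Torch criteria normally reduce over the batch (averaging by default via sizeAverage), so getting one loss per example means keeping the pre-reduction values. A hedged Python sketch of the arithmetic for a negative-log-likelihood criterion (batch shrunk from 64 to 2 for brevity; in Torch this corresponds to computing -x[i][class_i] per row before any averaging):

```python
def per_example_nll(batch_log_probs, targets):
    """Per-example loss vector: one -log p value per training example,
    instead of the single averaged number most criteria return.
    batch_log_probs: list of per-class log-probability lists;
    targets: list of class indices, one per example."""
    return [-lp[t] for lp, t in zip(batch_log_probs, targets)]

losses = per_example_nll([[-0.1, -2.3], [-1.6, -0.2]], [0, 1])
# one entry per example; averaging these recovers the usual batch loss
batch_loss = sum(losses) / len(losses)
```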
0
votes
1 answer

caffe outputs a negative loss value with the SoftmaxWithLoss layer?

Below is the last layer in my training net: layer { name: "loss" type: "SoftmaxWithLoss" bottom: "final" bottom: "label" top: "loss" loss_param { ignore_label: 255 normalization: VALID } } Note that I use a softmax_loss layer. Since…
huangh12
  • 11
  • 3
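A useful sanity check here: a genuine softmax-with-loss value is -log(p) for a probability p in (0, 1], so it can never be negative; a negative printed loss therefore points at something other than the raw criterion (e.g. how the value is aggregated or logged). A small property-check sketch in Python (illustrative, not Caffe's implementation):

```python
import math
import random

def softmax_nll(logits, target):
    """Softmax followed by negative log-likelihood, the math behind a
    SoftmaxWithLoss-style layer: -log(p[target]) with p in (0, 1]."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    return -math.log(exps[target] / sum(exps))

random.seed(0)
all_nonneg = all(
    softmax_nll([random.uniform(-10, 10) for _ in range(5)],
                random.randrange(5)) >= 0.0
    for _ in range(1000)
)
# all_nonneg is True: the value is nonnegative for every random draw
```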
0
votes
1 answer

Accuracy goes down during epochs in Keras

I've tried to write a neural network, but the accuracy doesn't change each epoch. I'm using Keras, and I can watch the accuracy change as each epoch is evaluated: it will start low, go up a bit, then drop back down to the exact same value…
0
votes
1 answer

Amibroker: Daily Loss Limit

I want to implement AFL code to find the Daily Loss Limit in intraday trading. I will use the code for backtesting over around 200 days. I have the following code, but it has mistakes. // identify new day dn = DateNum(); newDay = dn != Ref( dn,-1); //…
0
votes
0 answers

caffe training loss does not converge

My training loss does not converge (batch size: 16, average loss: 10). I have tried the following methods: varying the learning rate lr (the initial lr = 0.002 causes very high loss, around e+10); then with lr = e-6, the loss…
0
votes
1 answer

Loss functions in MATLAB

I want to know how to interpret loss function results in MATLAB. In other words, for example, if I got 0.3247 as the result of the kfoldLoss() function, does this mean it is a 32.47% error or a 0.3247% error, or how correctly can I define/interpret…
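Assuming the default classification loss ('classiferror', the misclassification rate), kfoldLoss returns a fraction in [0, 1], so 0.3247 reads as a 32.47% error rate, not 0.3247%. A tiny Python sketch of that fraction (illustrative data):

```python
def misclassification_rate(y_true, y_pred):
    """Fraction of examples where prediction != label; a value like
    0.3247 therefore reads as a 32.47% error rate, not 0.3247%."""
    wrong = sum(1 for t, p in zip(y_true, y_pred) if t != p)
    return wrong / len(y_true)

rate = misclassification_rate([0, 1, 1, 0], [0, 1, 0, 0])
# rate == 0.25, i.e. a 25% error rate
```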