- Is there a function in "torch.nn.functional" that gives me the cross-entropy loss for multi-class classification with integer labels (one integer class label per instance)?
- Should I convert both lists into a FloatTensor?
import torch
import torch.nn.functional as F
Right now, my predictions and actual targets are lists of integers:
predictions = [1, 2, 2, 5, 3]
actual_targets = [1, 2, 6, 5, 7]
Then I convert the lists to tensors:
predictions = torch.tensor(predictions)
actual_targets = torch.tensor(actual_targets)
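Checking the dtypes after this conversion, both come out as torch.int64 (Long) in my environment:
print(predictions.dtype)     # torch.int64
print(actual_targets.dtype)  # torch.int64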
I have also tried the suggestion from other posts to convert the tensors to Long when the labels are integers, but I get the same error message.
predictions = torch.LongTensor(predictions)
predictions = predictions.type(torch.LongTensor)        # .type() is not in-place, so assign the result
actual_targets = torch.LongTensor(actual_targets)
actual_targets = actual_targets.type(torch.LongTensor)
F.cross_entropy(predictions, actual_targets)
Error message:
---> 13 F.cross_entropy(predictions, actual_targets)
File /opt/conda/lib/python3.10/site-packages/torch/nn/functional.py:3029, in cross_entropy(input, target, weight, size_average, ignore_index, reduce, reduction, label_smoothing)
3027 if size_average is not None or reduce is not None:
3028 reduction = _Reduction.legacy_get_string(size_average, reduce)
-> 3029 return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index, label_smoothing)
RuntimeError: Expected floating point type for target with class probabilities, got Long
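For comparison, my understanding is that F.cross_entropy normally expects floating-point logits of shape (N, C) and Long class indices of shape (N,). A toy version along those lines (assuming 8 classes here, purely for illustration) runs without error for me, but in my case I only have a single predicted integer per instance rather than per-class scores:
logits = torch.randn(5, 8)                 # float scores, shape (N, C) = (5, 8)
targets = torch.tensor([1, 2, 6, 5, 7])    # Long class indices, shape (N,)
loss = F.cross_entropy(logits, targets)    # runs without error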