I am writing a custom optimizer, and the length of the bias's first dimension is not fixed, because the last batch may not have enough samples to fill a full batch. So weights and bias initialized with a fixed batch_size do not work with torch.add between the last (smaller) batch and the fixed-length tensors.
bias = torch.randn(batch_size, units)
batch_data = generator(path)
# for example:
weights.shape  # is (128, 256)
# but the last batch has only 50 samples, so:
out = sigmoid(x * weights + bias)  # fails: the lengths of the first dimensions do not match
So I wonder whether I can create a tensor where the length of one dimension is variable, like a variable-length list.
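For reference, a minimal runnable sketch of the mismatch. The shapes are taken from the example above; I'm assuming the intended operation is a matrix multiply into a dense layer, with `sigmoid` as `torch.sigmoid` and `in_features` a made-up input width. The second half shows the usual broadcasting pattern (a bias of shape `(units,)`, as `torch.nn.Linear` uses), which works for any batch size without a variable-length tensor:

```python
import torch

batch_size, units, in_features = 128, 256, 64

# Fixed-shape parameters, as in the question
weights = torch.randn(in_features, units)
bias = torch.randn(batch_size, units)   # first dim tied to batch_size

x_last = torch.randn(50, in_features)   # the smaller final batch
try:
    # (50, units) + (128, units): first dimensions differ, so this raises
    out = torch.sigmoid(x_last @ weights + bias)
except RuntimeError as err:
    print("mismatch:", err)

# Broadcasting workaround: bias of shape (units,) is added row-wise
# to a batch of any size
bias_1d = torch.randn(units)
out = torch.sigmoid(x_last @ weights + bias_1d)
print(out.shape)  # torch.Size([50, 256])
```

With the 1-D bias the same parameter tensor serves both the full 128-sample batches and the 50-sample remainder.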