Calling fit with trainingData, which is a torch.utils.data.Dataset, throws an error about inconsistent lengths:


import torch as T
from skorch import NeuralNetClassifier
from skorch.helper import predefined_split
from sklearn.model_selection import GridSearchCV

model = NeuralNetClassifier(
    module=NN,
    criterion=T.nn.BCELoss,
    # optimizer=T.optim.Adam,  # optimizer is searched over in param_grid below
    train_split=predefined_split(criticalData),  # criticalData is a fixed validation Dataset
    module__input_size=nSize*nSize,
    module__hidden_size=100,
    module__output_size=2
)

param_grid = {
    'optimizer': [T.optim.Adam, T.optim.SGD],
    'optimizer__lr': [0.01, 0.001, 0.0001],
    'optimizer__weight_decay': [0.0, 0.0001, 0.001, 0.01]
}

grid = GridSearchCV(model, param_grid, cv=None, scoring='accuracy', verbose=1, n_jobs=-1)
grid_result = grid.fit(trainingData, yTrain)

This raises:

ValueError: Dataset does not have consistent lengths.
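
For completeness, NN is a plain feed-forward module along these lines (a simplified sketch; the exact layers shouldn't matter for this error):

import torch as T

class NN(T.nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super().__init__()
        self.hidden = T.nn.Linear(input_size, hidden_size)
        self.out = T.nn.Linear(hidden_size, output_size)

    def forward(self, x):
        x = T.relu(self.hidden(x))
        # BCELoss expects probabilities, hence the sigmoid
        return T.sigmoid(self.out(x))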

I am trying to run GridSearchCV with a custom torch.utils.data.IterableDataset, but GridSearchCV does not appear to support iterable datasets. I therefore rewrote the dataset as a map-style torch.utils.data.Dataset, but the fit above fails with the inconsistent-lengths error, even though I pass the target values (yTrain) to fit separately, the dataset implements a working __len__ method, and it works perfectly fine with plain PyTorch. Wrapping the data in skorch's SliceDataset is not possible due to the size of the data involved.
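
The rewritten dataset is structured roughly like this (a minimal sketch; the file-backed loading and the names ImageDataset and files are illustrative stand-ins for the real data source):

import torch as T
from torch.utils.data import Dataset

class ImageDataset(Dataset):
    """Map-style rewrite of the original IterableDataset (simplified)."""

    def __init__(self, files, n_size):
        self.files = files    # one entry per sample, kept on disk
        self.n_size = n_size

    def __len__(self):
        # fixed, consistent length
        return len(self.files)

    def __getitem__(self, idx):
        # load a single sample lazily; the full data is too large for memory
        x = T.load(self.files[idx])
        return x.view(-1).float()  # flatten to n_size * n_size features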
