
For my deep learning project I need a CIFAR-10 data loader (Python) for a model that uses a different batch size on each iteration; unfortunately, torch.utils.data.DataLoader uses a constant batch size throughout training.

The batch size for each iteration is drawn from a geometric distribution.

If anyone has code for a data loader (especially for CIFAR-10) that uses a variable batch size, I would love to have a look.

I tried code from ChatGPT that builds a batch sampler class, but it didn't work either.

I also tried to build a class based on torch.utils.data.DataLoader, but I got a couple of errors.
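For reference, here is a minimal sketch of the kind of thing I'm after (my own attempt, not guaranteed correct): DataLoader accepts a `batch_sampler` argument that yields lists of indices, and each yielded list can have a different length, so the geometric batch sizes can live in the sampler. The class name `GeometricBatchSampler` and the parameter `p` are my own choices; the demo uses a dummy CIFAR-10-shaped TensorDataset so it runs without downloading anything.

```python
import numpy as np
import torch
from torch.utils.data import DataLoader, Sampler, TensorDataset


class GeometricBatchSampler(Sampler):
    """Yields index lists whose sizes are drawn i.i.d. from a
    geometric distribution (mean batch size is roughly 1/p)."""

    def __init__(self, dataset_len, p=0.01, seed=0):
        self.n = dataset_len
        self.p = p
        self.rng = np.random.default_rng(seed)

    def __iter__(self):
        order = self.rng.permutation(self.n)  # shuffle once per epoch
        i = 0
        while i < self.n:
            size = int(self.rng.geometric(self.p))  # always >= 1
            yield order[i:i + size].tolist()        # last batch may be short
            i += size

    def __len__(self):
        # The number of batches is random; report the expected count.
        return max(1, int(self.n * self.p))


# Demo with dummy data shaped like CIFAR-10 (64 RGB images, 32x32).
images = torch.randn(64, 3, 32, 32)
labels = torch.randint(0, 10, (64,))
dataset = TensorDataset(images, labels)

loader = DataLoader(dataset,
                    batch_sampler=GeometricBatchSampler(len(dataset),
                                                        p=0.1, seed=0))

for x, y in loader:
    print(x.shape)  # leading (batch) dimension varies per iteration
```

For the real dataset I would swap the TensorDataset for `torchvision.datasets.CIFAR10(root="./data", train=True, download=True, transform=transforms.ToTensor())` and pass it to the same DataLoader call. Note that when `batch_sampler` is given, `batch_size` and `shuffle` must be left at their defaults.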

Thanks in advance

