I am training a model and using Optuna for hyperparameter optimization. In my train function, I pass all the training images in the dataset to the model in batches of 4.
Say I have 20 images; that means 20/4 = 5 batches of my dataset are passed to my model. I have not added the concept of epochs.
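Here is a simplified sketch of what my train function roughly looks like (assuming PyTorch; the model, dataset, and loss are placeholders):

```python
import torch.nn as nn
from torch.utils.data import DataLoader

def train(model, dataset, optimizer):
    # A single pass over the dataset in batches of 4, no epoch loop.
    loader = DataLoader(dataset, batch_size=4, shuffle=True)
    criterion = nn.CrossEntropyLoss()
    model.train()
    total_loss = 0.0
    for images, labels in loader:  # 20 images -> 5 batches
        optimizer.zero_grad()
        outputs = model(images)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()
        total_loss += loss.item()
    return total_loss / len(loader)
```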
Now I integrate Optuna into my code to find the best learning rate and optimizer, and I get output for the different Optuna trials.
What I want to understand is: does one trial mean one epoch, since one trial has gone over my entire dataset in batches? Or do trials work differently from epochs, and will I have to add code to introduce epochs into my train function?
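For reference, this is roughly how I call the train function from the Optuna objective (again simplified; `build_model` and `train_dataset` are placeholders for my actual code, and the commented-out loop is where I suspect epochs would go):

```python
import optuna
import torch

def objective(trial):
    # Each trial suggests a learning rate and an optimizer type,
    # then calls train() once, i.e. one pass over the 5 batches.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    optimizer_name = trial.suggest_categorical("optimizer", ["Adam", "SGD"])

    model = build_model()  # placeholder for my actual model
    optimizer = getattr(torch.optim, optimizer_name)(model.parameters(), lr=lr)

    # Is this single call equivalent to one epoch, or do I need
    # something like the loop below inside each trial?
    # for epoch in range(n_epochs):
    #     loss = train(model, train_dataset, optimizer)
    loss = train(model, train_dataset, optimizer)
    return loss

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
```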