In k-fold cross-validation, why do we need to reset the weights after each fold? We use this function:
```python
def reset_weights(m):
    if isinstance(m, nn.Conv2d) or isinstance(m, nn.Linear):
        m.reset_parameters()
```

So we reset the model's weights so that each cross-validation fold starts from a random initial state instead of learning from the previous folds.
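For context, here is a minimal sketch of how I apply it per fold. The model, dataset, and training loop below are toy placeholders, and I'm assuming scikit-learn's `KFold` for the splits:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset, TensorDataset
from sklearn.model_selection import KFold

def reset_weights(m):
    if isinstance(m, nn.Conv2d) or isinstance(m, nn.Linear):
        m.reset_parameters()

# Toy model and random data, just for illustration
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
dataset = TensorDataset(torch.randn(100, 10), torch.randint(0, 2, (100,)))

kfold = KFold(n_splits=5, shuffle=True, random_state=42)
for fold, (train_idx, val_idx) in enumerate(kfold.split(dataset)):
    # Re-initialize all Conv2d/Linear layers so this fold starts fresh
    model.apply(reset_weights)

    train_loader = DataLoader(Subset(dataset, train_idx), batch_size=16, shuffle=True)
    val_loader = DataLoader(Subset(dataset, val_idx), batch_size=16)

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    for epoch in range(3):  # short toy training loop
        for xb, yb in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(xb), yb)
            loss.backward()
            optimizer.step()
    # ... evaluate on val_loader and record this fold's score ...
```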
Why is that important? I would have thought that if we don't reset, it would actually be better: the model would learn from all the folds and update its parameters across all of them, rather than from each fold on its own.