When using partial_fit with scikit-learn's SGDClassifier, only a single iteration (epoch) over the data is performed per call, as stated in the documentation:
Perform one epoch of stochastic gradient descent on given samples.
Internally, this method uses max_iter = 1. Therefore, it is not guaranteed that a minimum of the cost function is reached after calling it once. Matters such as objective convergence and early stopping should be handled by the user.
How can I increase max_iter so that my cost function is optimized properly rather than with just one iteration? Or, in terms of the scikit-learn description, how can I handle “objective convergence” and “early stopping” for my classifier when using partial_fit?
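For concreteness, here is a minimal sketch of what I imagine, calling partial_fit in a loop and checking convergence myself. The tolerance, the cap on epochs, and using the training log loss as the objective are my own choices, not anything scikit-learn provides for partial_fit, and the data is just a toy example:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import log_loss

# Toy data purely for illustration
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
classes = np.unique(y)

# loss="log_loss" so the training objective can be evaluated with log_loss;
# with the default hinge loss a different convergence proxy would be needed
clf = SGDClassifier(loss="log_loss", random_state=0)

tol = 1e-4          # my own stopping tolerance, not a parameter of partial_fit
max_epochs = 100    # my own cap on the number of passes over the data
prev_loss = np.inf

for epoch in range(max_epochs):
    clf.partial_fit(X, y, classes=classes)        # one epoch of SGD
    loss = log_loss(y, clf.predict_proba(X))      # training objective after this epoch
    if prev_loss - loss < tol:                    # crude "objective convergence" check
        print(f"Stopped after {epoch + 1} epochs")
        break
    prev_loss = loss
```

For early stopping I assume I would do something similar, but monitor the score on a held-out validation set instead of the training loss. Is this the intended way, or is there a built-in mechanism I am missing?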