Yes, there is a technique called warm_start which, quoting from the documentation, means:
warm_start : bool, default: False
When set to True, reuse the solution of the previous call to fit as initialization, otherwise,
just erase the previous solution. Useless for liblinear solver.
As described in the documentation, it's available in LogisticRegression:

sklearn.linear_model.LogisticRegression(..., warm_start=False, n_jobs=1)
So concretely, for your case you would do the following:
from sklearn.linear_model import LogisticRegression

# create an instance of LogisticRegression with warm_start=True
logreg = LogisticRegression(C=1e5, warm_start=True)

# you can access the value of the C parameter as follows
logreg.C  # it's set to 100000.0

# ....
# train your model here by calling logreg.fit(..)
# ....

# change the value of the C parameter as follows
logreg.C = 1e3
logreg.C  # now it's set to 1000.0

# ....
# re-train your model here by calling logreg.fit(..)
# ....
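To make the pattern above concrete, here is a minimal runnable sketch; the synthetic dataset from make_classification is only for illustration and stands in for your own data:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# a small synthetic dataset standing in for your real data
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# first fit with a large C
logreg = LogisticRegression(C=1e5, warm_start=True, solver="lbfgs", max_iter=1000)
logreg.fit(X, y)
coef_first = logreg.coef_.copy()

# change C and refit; with warm_start=True the coefficients from the
# first fit are used as the starting point of the new optimization,
# instead of the solver starting from scratch
logreg.C = 1e3
logreg.fit(X, y)
```

The second fit typically converges in fewer iterations because it starts from the previous solution rather than from zero.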
From what I was able to check quickly, it's also available in the following: