I am trying to implement the incremental stochastic gradient descent (ISGD) algorithm for logistic regression. I have already coded the logistic regression loss function and its gradient, and I have a rough idea of the rest of the workflow. However, I don't know how to apply the sequential, per-sample update that incremental SGD requires to this logistic regression. How can I implement this sequential operation in Python?
Objective: the logistic regression loss function and its gradient.
My initial implementation:
import numpy as np
# implementation of the logistic regression loss function
def lossFunc(X, y, w):
    w = w.reshape((w.shape[0], 1))
    y = y.reshape((y.shape[0], 1))
    # loss = sum of log(1 + exp(-y * Xw));
    # np.logaddexp(0, z) computes log(1 + exp(z)) without overflow
    loss = np.logaddexp(0.0, -y * np.dot(X, w))
    return float(np.sum(loss))
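As a quick sanity check of any implementation of this loss: at w = 0 every margin is zero, so the loss must equal n·log 2. A minimal self-contained version (using np.logaddexp for numerical stability, which is my choice, not part of the original code) can be checked like this:

```python
import numpy as np

# stable logistic loss: np.logaddexp(0, z) == log(1 + exp(z))
def logistic_loss(X, y, w):
    return float(np.sum(np.logaddexp(0.0, -y * (X @ w))))

X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = np.array([1.0, -1.0, 1.0])
w = np.zeros(2)
print(logistic_loss(X, y, w))  # n * log(2) = 3 * 0.6931... ≈ 2.0794
```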
# gradient of the loss with respect to w
def gradFnc(X, y, w):
    w = w.reshape((w.shape[0], 1))
    y = y.reshape((y.shape[0], 1))
    # d/dw log(1 + exp(-y * Xw)) = -y * x * sigmoid(-y * Xw)
    s = 1.0 / (1.0 + np.exp(y * np.dot(X, w)))  # sigmoid(-y * Xw)
    grad = -np.dot(X.T, y * s)
    return grad.reshape((grad.shape[0],))
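Whatever form the gradient code takes, it is worth verifying it against a central finite-difference approximation of the loss before plugging it into SGD. Here is a self-contained check (the `loss`/`grad` helpers below are my own stand-ins, written to match the formulas above):

```python
import numpy as np

def loss(X, y, w):
    # sum of log(1 + exp(-y_i * x_i . w))
    return float(np.sum(np.logaddexp(0.0, -y * (X @ w))))

def grad(X, y, w):
    # analytic gradient: -X^T (y * sigmoid(-y * X.w))
    s = 1.0 / (1.0 + np.exp(y * (X @ w)))
    return -X.T @ (y * s)

# central finite-difference check of the analytic gradient
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
y = np.sign(rng.normal(size=5))
w = rng.normal(size=3)
eps = 1e-6
num = np.array([(loss(X, y, w + eps * e) - loss(X, y, w - eps * e)) / (2 * eps)
                for e in np.eye(3)])
print(np.allclose(num, grad(X, y, w), atol=1e-5))  # True
```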
class LogisticRegressionGD(object):
    def __init__(self, learnRate=0.0001, num_iter=100, verbose=False):
        self.w = None
        self.learnRate = learnRate
        self.verbose = verbose
        self.num_iter = num_iter

    def fit(self, X, y):
        n, d = X.shape
        self.w = np.zeros(shape=(d,))
        for i in range(self.num_iter):
            grd = gradFnc(X, y, self.w)
            self.w = self.w - self.learnRate * grd
            if self.verbose:
                print("Iteration:", i, "Loss:", lossFunc(X, y, self.w))
        return self
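For the incremental (sequential) part, the change from full-batch gradient descent is that the weights are updated once per training example rather than once per pass, usually visiting the samples in a fresh random order each epoch. A minimal sketch of that loop (function name, learning rate, and epoch count are my own choices):

```python
import numpy as np

def sgd_fit(X, y, lr=0.1, n_epochs=100, seed=0):
    # Incremental (sequential) SGD: one weight update per training example.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for epoch in range(n_epochs):
        for i in rng.permutation(n):  # visit samples in random order
            margin = -y[i] * (X[i] @ w)
            # per-sample gradient of log(1 + exp(margin)):
            # -y_i * x_i * sigmoid(margin)
            g = -y[i] * X[i] / (1.0 + np.exp(-margin))
            w -= lr * g
    return w

# tiny linearly separable example
X = np.array([[2.0, 0.0], [1.0, 0.0], [-1.0, 0.0], [-2.0, 0.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = sgd_fit(X, y)
print(np.sign(X @ w))  # matches y
```

Hogwild!-style training runs this same per-sample update from several threads without locking the shared weight vector; the sequential loop above is the building block each thread would execute.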
Now I don't know how to apply the sequential, per-sample update of incremental SGD to this logistic regression. How can I make this happen? Is there an efficient way to implement a sequential incremental SGD algorithm for logistic regression? Thanks.
The particular variant of incremental SGD I have in mind is described here: the Hogwild! algorithm for logistic regression.
What would an efficient programming pipeline for the task stated above look like? Any idea?