
I was trying to code SGD for L2-regularized logistic regression in Python, but my average loss stays almost the same every epoch. Can someone help me out with the code?

Function to predict y

from math import exp

def predict(row, coefficients):
    # coefficients[0] is the bias term; the remaining entries pair with the features in row
    yhat = coefficients[0]
    for i in range(len(row) - 1):
        yhat += coefficients[i + 1] * row[i]
    return 1.0 / (1.0 + exp(-yhat))

Function to calculate the loss

import numpy as np

def loss_func(w, x_i, lam):
    # x_i is one DataFrame row; its last entry is the label 'y'
    y = x_i['y']
    yhat = predict(x_i[:-1], w)
    loss = (y * np.log(yhat)) - ((1 - y) * np.log(1 - yhat)) + (lam * np.linalg.norm(w) / 2)
    return loss
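
For reference, the per-example objective I'm trying to minimize is the textbook negative log-likelihood plus an L2 penalty; written out as a standalone sketch (not my actual code above), it would be something like:

def reference_loss(y, yhat, w, lam):
    # textbook objective for one example: negative log-likelihood + (lam/2) * ||w||^2
    return -(y * np.log(yhat) + (1 - y) * np.log(1 - yhat)) + (lam / 2.0) * np.dot(w, w)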

Function to update weights

def weights(w, x, lrate, n_epoch, lam):
    total_loss_lst = [0, 1]
    for epoch in range(n_epoch):
        sum_error = 0
        total_loss = 0
        # average loss over the whole data set with the current weights
        for k in range(len(x)):
            x_i = x.iloc[k]
            total_loss += loss_func(w, x_i, lam)

        # one stochastic update per epoch: pick a single random row
        each_row = x.iloc[np.random.randint(len(x))]
        y_pred = predict(each_row[:-1], w)
        error = y_pred - each_row['y']
        w[0] = w[0] + (lrate * error)
        for i in range(0, len(each_row) - 1):
            w[i + 1] = w[i + 1] - (lrate * error * y_pred * (1 - y_pred) * each_row[i])
            # w[i+1] = w[i+1] + (lrate*error*each_row[i])
        total_loss_lst.append(total_loss / len(x))
        print('>epoch=%d, lrate=%.3f, error=%.3f' % (epoch, lrate, total_loss / len(x)))
    return w

Calling the function; the w vector is all zeros at the start.

weights(w, x, .0001, 10, .0001)
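
For context, x is a pandas DataFrame whose last column is the label 'y', and w is a plain Python list of zeros with one entry for the bias plus one per feature. A minimal stand-in for my setup (the column names and values below are placeholders, not my real data):

import pandas as pd
import numpy as np

# toy data frame with two features and the binary label in the last column
x = pd.DataFrame({
    'f1': np.random.randn(100),
    'f2': np.random.randn(100),
    'y': np.random.randint(0, 2, size=100),
})
w = [0.0] * x.shape[1]  # bias + one weight per feature column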

This is the output I am getting:

>epoch=0, lrate=0.000, error=0.274
>epoch=1, lrate=0.000, error=0.274
>epoch=2, lrate=0.000, error=0.274
>epoch=3, lrate=0.000, error=0.274
>epoch=4, lrate=0.000, error=0.274
>epoch=5, lrate=0.000, error=0.275
>epoch=6, lrate=0.000, error=0.275
>epoch=7, lrate=0.000, error=0.275
>epoch=8, lrate=0.000, error=0.275
>epoch=9, lrate=0.000, error=0.275

The weights are getting updated, but the loss barely changes. For comparison, sklearn's loss on the same data ranges from about .45 down to .37.
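
I don't have the exact sklearn snippet in front of me, but the comparison was along these lines (a sketch; the solver and parameters here are illustrative, and loss='log_loss' assumes a recent sklearn version):

from sklearn.linear_model import SGDClassifier
from sklearn.metrics import log_loss

X_feat = x.drop(columns=['y'])  # same DataFrame x as above, without the label
clf = SGDClassifier(loss='log_loss', penalty='l2', alpha=.0001,
                    learning_rate='constant', eta0=.0001)
for epoch in range(10):
    clf.partial_fit(X_feat, x['y'], classes=[0, 1])
    print(log_loss(x['y'], clf.predict_proba(X_feat)))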

Thank you
