
I am trying to implement logistic regression using scipy.optimize in Python. When I tried sklearn, I learned about the beta values it estimates for a given dataset. But when I use scipy's optimize function, the cost value goes to infinity, or sometimes the values are fine but there is no intercept. I suspect the cause is the bounds and the initial guess. How should these be formulated? If there is another method, please let me know.

import scipy.optimize
import numpy as np


def sigmoid(x, beta):
    # logistic function applied to the linear predictor x @ beta
    return np.exp(np.dot(x, beta)) / (1.0 + np.exp(np.dot(x, beta)))

def mle(beta, x, y):
    # negative log-likelihood of the logistic model, to be minimized
    return -np.sum(y * np.log(sigmoid(x, beta)) + (1 - y) * np.log(1 - sigmoid(x, beta)))

# x0 value ? and bounds ?
result = scipy.optimize.minimize(mle, x0=np.array([-.1]), args=(x, y))

There is no intercept in this. The shape of the dataframe is (100, 2). Please advise.
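
For reference, the sklearn beta values mentioned above can be obtained with something like the following sketch (the synthetic x and y are hypothetical stand-ins for the actual single feature column and binary labels):

import numpy as np
from sklearn.linear_model import LogisticRegression

# Stand-in data: one feature column and 0/1 labels, as assumed for the (100, 2) dataframe.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
y = (x[:, 0] + rng.normal(size=100) > 0).astype(int)

clf = LogisticRegression(fit_intercept=True)   # fits an intercept term by default
clf.fit(x, y)
print(clf.intercept_, clf.coef_)               # the intercept and per-feature betas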


1 Answer


I just added x = np.hstack((np.ones((x.shape[0], 1)), x)), and this solved the problem: prepending a column of ones to x makes the first entry of beta act as the intercept.
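
A minimal end-to-end sketch of this fix, using hypothetical synthetic data in place of the (100, 2) dataframe (the data generation and starting values are assumptions, not part of the original code):

import numpy as np
import scipy.optimize

# Stand-in data: one feature column x and 0/1 labels y.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
y = (0.5 + 2.0 * x[:, 0] + rng.normal(size=100) > 0).astype(float)

def sigmoid(x, beta):
    return np.exp(np.dot(x, beta)) / (1.0 + np.exp(np.dot(x, beta)))

def mle(beta, x, y):
    p = sigmoid(x, beta)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

x = np.hstack((np.ones((x.shape[0], 1)), x))   # column of ones -> first beta is the intercept
x0 = np.zeros(x.shape[1])                      # one starting value per column; no bounds needed
result = scipy.optimize.minimize(mle, x0=x0, args=(x, y))
print(result.x)                                # [intercept, slope]

Starting from zeros keeps the initial cost finite (every predicted probability is 0.5), so no bounds are required for the default BFGS method.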
