
I'm learning ML by myself, and I get an error when I try to code logistic regression in Python. This is from the Stanford online course. I've tried many things, including changing grad to grad.ravel()/grad.flatten(), but none of them worked.

Input:

import numpy as np

data=np.loadtxt(r'E:\ML\machine-learning-ex2\ex2\ex2data1.txt',delimiter=',')

X=data[:,:2]
y=data[:,2].reshape(-1,1)

def sigmoid(z):
    return 1/(np.exp(-1*z)+1)

def costFunction(theta,X,y):
    m=len(y)
    h=sigmoid(np.dot(X,theta))
    J=-1/m*np.sum((np.dot(y.T,np.log(h))+np.dot((1-y).T,np.log(1-h))))
    grad=1/m*np.dot(X.T,(h-y))
    return J,grad

m,n=np.shape(X)
X=np.hstack((np.ones([m,1]),X))
initial_theta=np.zeros([n+1,1])

import scipy.optimize as opt
result = opt.fmin_tnc(func=costFunction, x0=initial_theta, args=(X, y))

Output:

    ---> 25 result = opt.fmin_tnc(func=costFunction, x0=initial_theta, args=(X, y))

    ValueError: tnc: invalid gradient vector from minimized function.
Axiumin_

2 Answers


Well, I forgot to copy these lines: m,n = np.shape(X) and initial_theta = np.zeros(n+1). That turned out to be the answer: the x0 parameter needs to be a 1D array, but I gave it a 2D array. So just make initial_theta 1D and reshape it back to 2D inside costFunction.
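A minimal sketch of that fix, with synthetic data standing in for ex2data1.txt (the random data here is a placeholder, not the course dataset): pass a 1D initial_theta to fmin_tnc, reshape theta to a column vector inside costFunction, and return the gradient flattened to 1D.

```python
import numpy as np
from scipy.optimize import fmin_tnc

# Placeholder data with overlapping classes (stands in for ex2data1.txt)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + 0.5 * rng.normal(size=100) > 0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def costFunction(theta, X, y):
    theta = theta.reshape(-1, 1)          # restore 2D inside the function
    m = len(y)
    h = sigmoid(X @ theta)
    J = -1 / m * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))
    grad = 1 / m * (X.T @ (h - y))
    return J, grad.flatten()              # gradient must come back as 1D

m, n = X.shape
X = np.hstack((np.ones((m, 1)), X))       # add the intercept column
initial_theta = np.zeros(n + 1)           # x0 must be a 1D array

result = fmin_tnc(func=costFunction, x0=initial_theta, args=(X, y))
theta_opt = result[0]
```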

Scarlett

I got the same error. I solved it by adding the parameter approx_grad=True to the fmin_tnc call (refer: https://towardsdatascience.com/building-a-logistic-regression-in-python-301d27367c24)

import numpy as np
from scipy.optimize import fmin_tnc

def sigmoid(z):
    z = z.astype(float)
    return 1 / (1 + np.exp(-z))

def net_input(theta, x):
    return np.dot(x, theta)

def probability(theta, x):
    return sigmoid(net_input(theta, x))

def cost_function(theta, x, y):
    m = x.shape[0]
    total_cost = -(1 / m) * np.sum(y * np.log(probability(theta, x)) + (1 - y) * np.log(1 - probability(theta, x)))
    return total_cost

def gradient(theta, x, y):
    m = x.shape[0]
    return (1 / m) * np.dot(x.T, sigmoid(net_input(theta, x)) - y)

def fit(x, y, theta):
    # approx_grad=True makes fmin_tnc estimate the gradient numerically,
    # so the shape returned by fprime no longer triggers the ValueError
    opt_weights = fmin_tnc(func=cost_function, x0=theta, fprime=gradient, approx_grad=True, args=(x, y.flatten()))
    return opt_weights[0]

y = loan_data_new.iloc[:, -1].to_numpy()
y = y[:, np.newaxis]
m, n = np.shape(X)
one_vec = np.ones((m, 1))
X = np.hstack((one_vec, X))  # add the intercept column once
theta = np.zeros(n + 1)      # x0 must be a 1D array
parameters = fit(X, y, theta)
Manav Patadia