I am creating a simple neural network that uses an image as its input. It's all done except that I have to minimize my cost function, but when I run scipy.optimize.minimize, it just sits there. I'd assume it shouldn't take long, since the amount of data I'm passing through isn't very large. My question is why it won't terminate successfully and why it doesn't raise an error instead. It also uses almost all of my CPU after the program starts, but hours later it still hasn't terminated.
import numpy
from scipy import optimize

input_layer_size = 625
hidden_layer_size = 40
num_labels = 1

# Randomly initialize the weight matrices for the two layers
Theta1 = randInitializeWeights(input_layer_size, hidden_layer_size)
Theta2 = randInitializeWeights(hidden_layer_size, num_labels)

# Unroll the weight matrices into a single 1-D parameter vector
nn_params = numpy.asarray(list(Theta1.flat) + list(Theta2.flat))

def costFunction(nn_params, input_layer_size, hidden_layer_size, num_labels, X, y, lambd):
    # Unrolls nn_params back into Theta1/Theta2 and returns the scalar cost J (a float)
    ...

# X, y and lambd come from earlier in my program
res = optimize.minimize(costFunction,
                        x0=nn_params,
                        args=(input_layer_size, hidden_layer_size, num_labels, X, y, lambd),
                        options={'maxiter': 50, 'disp': True})
print(res)
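
For reference, here is a stripped-down, self-contained sketch of the same call structure that I can run end to end. randInitializeWeights here is just an assumed epsilon-style initializer, dummyCost is a stand-in quadratic (not my real cost function), and the layer sizes are much smaller than in my actual code. I'm only including it to show the shapes I'm passing and the scalar return value I believe minimize expects:

import numpy
from scipy import optimize


def randInitializeWeights(L_in, L_out, epsilon_init=0.12):
    # Assumed implementation: uniform values in [-epsilon_init, epsilon_init],
    # shaped (L_out, L_in + 1) to include a bias column.
    return numpy.random.rand(L_out, L_in + 1) * 2 * epsilon_init - epsilon_init


def dummyCost(nn_params, input_layer_size, hidden_layer_size, num_labels, X, y, lambd):
    # Stand-in for the real cost function: it ignores the extra args and
    # returns a plain scalar float, which is what minimize expects.
    return float(numpy.sum(nn_params ** 2))


# Much smaller sizes than my real network, just to show the call structure
input_layer_size = 4
hidden_layer_size = 3
num_labels = 1

Theta1 = randInitializeWeights(input_layer_size, hidden_layer_size)
Theta2 = randInitializeWeights(hidden_layer_size, num_labels)
nn_params = numpy.concatenate([Theta1.ravel(), Theta2.ravel()])

# Placeholder data so the args tuple has the same form as in my code
X = numpy.random.rand(10, input_layer_size)
y = numpy.random.rand(10, num_labels)
lambd = 1.0

res = optimize.minimize(dummyCost,
                        x0=nn_params,
                        args=(input_layer_size, hidden_layer_size, num_labels, X, y, lambd),
                        options={'maxiter': 50, 'disp': True})
print(res.success, res.nit)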