I find myself reshaping 1D vectors way too many times. I wonder whether this is because I'm doing something wrong, or whether it's an inherent shortcoming of numpy.
Why can't numpy infer, when it gets an object of shape (400,), that it should transform it to (400, 1)? And why do so many numpy operations remove an axis completely?
e.g.
import numpy as np

def sigmoid(z):
    # my helper, shown here for completeness: the standard logistic function
    return 1 / (1 + np.exp(-z))

def predict(Theta1, Theta2, X):
    m = X.shape[0]
    X = np.c_[np.ones(m), X]                 # prepend bias column
    hidden = sigmoid(X @ Theta1.T)
    hidden = np.c_[np.ones(m), hidden]       # prepend bias column to hidden layer
    output = sigmoid(hidden @ Theta2.T)
    result = np.argmax(output, axis=1) + 1   # removes the 2nd axis - (400,)
    return result.reshape((-1, 1))           # re-add the axis - (400, 1)
pred = predict(Theta1, Theta2, X)
print(np.mean(pred == y))
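The axis-dropping part in isolation, as a minimal sketch with a made-up (400, 10) array standing in for the network output (the shapes are just for illustration):

import numpy as np

output = np.random.rand(400, 10)      # made-up stand-in for the sigmoid output
flat = np.argmax(output, axis=1)      # reducing over axis 1 drops it -> shape (400,)
print(flat.shape)                     # (400,)

col_a = flat.reshape(-1, 1)           # explicit reshape back to (400, 1)
col_b = flat[:, np.newaxis]           # same result via indexing
print(col_a.shape, col_b.shape)       # (400, 1) (400, 1)

I know np.argmax also accepts keepdims=True in newer NumPy releases (1.22+, I believe), but I'm asking more generally why dropping the axis is the default.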
If I don't reshape the result at the end of predict, I get surprising behavior when comparing pred, shape (400,), against y, shape (400, 1): the comparison broadcasts instead of matching elementwise.
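Concretely, here is a minimal sketch with made-up values (shapes chosen to match my data):

import numpy as np

pred = np.arange(400)                  # shape (400,)
y = np.arange(400).reshape(-1, 1)      # shape (400, 1)

# (400,) vs (400, 1) broadcasts to a (400, 400) boolean matrix,
# so the mean is nowhere near the accuracy I expect.
print((pred == y).shape)               # (400, 400)
print(np.mean(pred == y))              # 0.0025, not 1.0

print(np.mean(pred.reshape(-1, 1) == y))  # 1.0 once the shapes match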