
I was trying to build a neural network in Python to solve a regression problem with inputs X = (a, b) and output Y = (c), using leaky ReLU as the activation function for the hidden layer and a linear function for the output layer. After 3-4 iterations the network blows up with extremely large/small numbers and ends up producing NaN. The derivatives I have used are below. Maybe someone can help me: is the problem in my math, or should I do more work to normalize X and Y before feeding them to the network?

    # Gradients for the output (linear) layer
    dW2 = -2 * np.dot(dZ2, A1.transpose()) / m
    db2 = -2 * np.sum(dZ2, axis=1, keepdims=True) / m
    # Backpropagate through the leaky ReLU hidden layer
    drel = lrelu(Z1)                      # derivative of leaky ReLU at Z1
    dZ1 = np.dot(W2.transpose(), dZ2) * drel
    dW1 = np.dot(dZ1, X.transpose()) / m
    db1 = np.sum(dZ1, axis=1, keepdims=True) / m

Where

    Z1 = np.dot(W1, X) + b1
    A1 = np.where(Z1 > 0, Z1, Z1 * 0.01)   # leaky ReLU, slope 0.01 for negatives
    Z2 = np.dot(W2, A1) + b2
    A2 = Z2                                 # linear output layer
    cost = np.sum(np.square(Y - A2)) / m    # mean squared error

And the leaky ReLU derivative:

    def lrelu(rel):
        # Derivative of leaky ReLU: 1 where rel >= 0, alpha where rel < 0
        alpha = 0.01
        drel = np.ones_like(rel)
        drel[rel < 0] = alpha
        return drel
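
One way to answer the math question directly is a finite-difference gradient check. Below is a minimal, self-contained sketch of such a check against the setup above; since the post does not show how dZ2 is computed, it assumes dZ2 = -2*(Y - A2) (the full derivative of the cost with respect to Z2), and the toy data, shapes, and the forward helper are illustrative only:

    import numpy as np

    rng = np.random.default_rng(0)
    m, h = 5, 4                                  # toy sizes: 5 examples, 4 hidden units
    X = rng.normal(size=(2, m))                  # inputs (a, b) in columns
    Y = rng.normal(size=(1, m))                  # target c
    W1, b1 = 0.1 * rng.normal(size=(h, 2)), np.zeros((h, 1))
    W2, b2 = 0.1 * rng.normal(size=(1, h)), np.zeros((1, 1))

    def lrelu(rel):
        # Derivative of leaky ReLU: 1 where rel >= 0, 0.01 where rel < 0
        drel = np.ones_like(rel)
        drel[rel < 0] = 0.01
        return drel

    def forward(W1, b1, W2, b2):
        Z1 = np.dot(W1, X) + b1
        A1 = np.where(Z1 > 0, Z1, Z1 * 0.01)     # leaky ReLU
        Z2 = np.dot(W2, A1) + b2                 # linear output, A2 = Z2
        cost = np.sum(np.square(Y - Z2)) / m
        return Z1, A1, Z2, cost

    # Analytic gradients, with the -2 factor folded into dZ2
    Z1, A1, A2, _ = forward(W1, b1, W2, b2)
    dZ2 = -2 * (Y - A2)
    dW2 = np.dot(dZ2, A1.T) / m
    dZ1 = np.dot(W2.T, dZ2) * lrelu(Z1)
    dW1 = np.dot(dZ1, X.T) / m

    # Numerical gradient for one entry of W1 via central differences
    eps = 1e-6
    W1p, W1m = W1.copy(), W1.copy()
    W1p[0, 0] += eps
    W1m[0, 0] -= eps
    num = (forward(W1p, b1, W2, b2)[3] - forward(W1m, b1, W2, b2)[3]) / (2 * eps)
    print(num, dW1[0, 0])                        # should agree to ~6 decimal places

If the two numbers disagree, the analytic gradient has a sign or factor error somewhere. Note that in the code above the -2 appears in dW2 and db2 but not in dW1 and db1, so it is worth checking that dZ2 carries that factor consistently for both layers.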

Thanks


1 Answer


I have already solved the problem by preprocessing the data.
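
The answer does not say which preprocessing was used, so here is a minimal sketch of one common option that fixes this kind of blow-up: standardizing X and Y to zero mean and unit variance before training. The raw data below is made up for illustration, and the layout matches the question (features/targets along rows, examples along columns):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1000, size=(2, 100))      # raw inputs (a, b) on a large scale
    Y = 3 * X[0:1] + 0.5 * X[1:2]                # raw target c

    # Standardize each row (feature/target) to zero mean and unit variance
    X_mean, X_std = X.mean(axis=1, keepdims=True), X.std(axis=1, keepdims=True)
    Y_mean, Y_std = Y.mean(axis=1, keepdims=True), Y.std(axis=1, keepdims=True)
    Xn = (X - X_mean) / X_std
    Yn = (Y - Y_mean) / Y_std

    # Train the network on Xn and Yn instead of X and Y; afterwards,
    # map predictions back to the original scale:
    #   c_pred = A2 * Y_std + Y_mean

The reason this helps: large raw input and target scales make the squared-error gradients large, so each update overshoots and the weights diverge to NaN within a few iterations. Standardization keeps the gradients at a manageable scale, and lowering the learning rate can help for the same reason.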
