I am trying to estimate the mean value of log2(det(AA^T) + 1) in Python. My simple code works fine until I reach 17×17 matrices, at which point it raises a math domain error. Here is the code:
    import math
    import numpy as np

    iters = 10000  # renamed from `iter`, which shadows the builtin
    for n in xrange(1, 20):
        h = n
        dets = []
        for _ in xrange(iters):
            # random h-by-n matrix with entries drawn from {-1, 1}
            A = np.random.randint(2, size=(h, n)) * 2 - 1
            detA_Atranspose = np.linalg.det(np.dot(A, A.transpose()))
            try:
                logdetA_Atranspose = math.log(detA_Atranspose + 1, 2)
            except ValueError:
                print "Ooops!", n, detA_Atranspose
            dets.append(logdetA_Atranspose)
        print np.mean(dets)
A is supposed to be a matrix whose elements are all either -1 or 1.
What am I doing wrong, how can it be fixed, and what is special about 17?
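For context on the error itself: AA^T is positive semidefinite, so its exact determinant is never negative, which means a math domain error can only happen when np.linalg.det (which works in floating point) returns a negative value whose magnitude exceeds 1 for a singular AA^T. Below is a minimal sketch of one way to guard against that, assuming it is acceptable for the estimate to clamp the floating-point noise at zero; the function name mean_log2_det is made up for illustration, and the loop is written in Python 3 style (range instead of xrange):

```python
import math

import numpy as np


def mean_log2_det(n, trials=1000):
    """Estimate E[log2(det(A A^T) + 1)] over random {-1, 1} matrices.

    The exact determinant of A A^T is >= 0, so any negative value
    coming back from np.linalg.det is floating-point error and can
    be clamped to zero before taking the logarithm.
    """
    vals = []
    for _ in range(trials):
        # random n-by-n matrix with entries in {-1, 1}
        A = np.random.randint(2, size=(n, n)) * 2 - 1
        det = np.linalg.det(np.dot(A, A.T))
        det = max(det, 0.0)  # discard negative float noise
        vals.append(math.log(det + 1, 2))
    return np.mean(vals)
```

With this guard the log argument is always at least 1, so the ValueError cannot occur at any size, including 17×17.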