I have tested the following Python script on two Windows machines and on onlinegdb's online Python interpreter.
On the Windows machines, the script, run as-is, just exits midway, non-deterministically, with no error message or warning. I tested with Python 3.9.6.
It works as expected and does not crash when nCr is defined outside isPrime instead of inside it. You can comment and uncomment the appropriate lines to switch between the two versions.
On onlinegdb's interpreter, it works as expected in both cases.
import sys

sys.setrecursionlimit(10**6)

# def nCr(n, r):
#     if r == 0:
#         return 1
#     else:
#         return nCr(n, r-1) * (n-r+1) / r

def isPrime(N):
    def nCr(n, r):
        if r == 0:
            return 1
        else:
            return nCr(n, r-1) * (n-r+1) / r

    if nCr(N, N//2) % N == 0:
        return True
    else:
        return False

for i in range(4050, 4060 + 1):
    print(f"{i}: {isPrime(i)}")
else:
    print("Done")
Any clues on what may be causing this? Is it possible to get this to run correctly on Windows? Or should I just avoid inner functions entirely?
Note: I know that the prime checker's logic is incorrect.
Note: If you are not able to reproduce the crash, try a larger range in the for loop at the bottom.
Edit 1:
We found that:
- If the recursion depth is sufficiently large, it will most likely cause a crash on all platforms. The depth at which this happens, although large, is still small enough that only a small fraction of the machine's memory is in use.
- Moving the function to module level does not prevent the crash.
- Increasing the limit via sys.setrecursionlimit does not affect the crash, as long as it is already higher than the depth at which the crash occurs.
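One way to poke at these observations (this is my assumption about the mechanism, not something confirmed in the original post) is to note that a thread's C stack size can be chosen with threading.stack_size before the thread is started, independently of the Python-level recursion limit. The probe below, with its illustrative depth_probe helper and arbitrary 256 MiB figure, is a sketch for experimenting with how deep you can recurse on a stack of a given size:

```python
import sys
import threading

def depth_probe(limit):
    """Recurse `limit` calls deep and return the depth actually reached."""
    depth = 0

    def recurse(n):
        nonlocal depth
        depth = n
        if n < limit:
            recurse(n + 1)

    recurse(1)
    return depth

sys.setrecursionlimit(10**6)  # as in the original script

# Hypothetical experiment: give the worker thread a large C stack
# (256 MiB here, chosen arbitrarily) before starting it, then run the
# probe in that thread. Varying this size and the probe depth lets you
# observe at what depth a given stack size gives out.
threading.stack_size(256 * 1024 * 1024)
result = []
t = threading.Thread(target=lambda: result.append(depth_probe(50_000)))
t.start()
t.join()
print(result[0])
```

Note that threading.stack_size only applies to threads started after the call; it cannot resize the main thread's stack.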
So, the question now is:
Is there a way to estimate the depth at which the crash will occur? Also, the depth at which the crash occurs is quite small, and if we manage our own stack instead of calling the function recursively, we can keep going until the machine runs out of memory. So, should we just avoid recursive function calls in Python altogether?
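To make the "own stack" point concrete: since this nCr only ever recurses on nCr(n, r-1), the explicit stack degenerates into a plain loop, so a minimal iterative rewrite (a sketch that keeps the original float-division arithmetic unchanged) could look like:

```python
def nCr(n, r):
    # Each recursive call nCr(n, k) becomes one loop iteration,
    # so there is no call depth that grows with r.
    result = 1
    for k in range(1, r + 1):
        result = result * (n - k + 1) / k
    return result
```

For genuinely branching recursion, the same idea means pushing pending calls onto a Python list and popping them in a loop, so the "stack" lives on the heap rather than the C call stack.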