Problem statement:

Let p(n) represent the number of different ways in which n coins can be separated into piles. For example, five coins can be separated into piles in exactly seven different ways, so p(5) = 7.

Find the least value of n for which p(n) is divisible by one million.
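(For reference, the seven ways for five coins are 5, 4+1, 3+2, 3+1+1, 2+2+1, 2+1+1+1, and 1+1+1+1+1.)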
Below is the code I wrote to solve this problem using recursion. I know it's not the optimal approach, but it should still give the right answer. However, for some reason I don't understand, it reports that n = 2301 is the least n for which p(n) is divisible by 1,000,000, with p(2301) = 17022871133751703055227888846952967314604032000000. Why is this not the correct answer? I checked the values for n < 20 and they match the known ones (a small exact-integer checker is included after the code). So what's wrong with my code?
import numpy as np
import sys

# Allow deeper recursion than Python's default of 1000
sys.setrecursionlimit(3000)

# Memoization table: table[sum, largestNumber] caches partition counts
# (np.zeros defaults to float64 entries)
table = np.zeros((10000, 10000))

def partition(sum, largestNumber):
    # No parts left to use: no way to form a positive sum
    if largestNumber == 0:
        return 0
    # Exact sum reached: one valid partition
    if sum == 0:
        return 1
    # Overshot the sum: dead end
    if sum < 0:
        return 0
    # Return the cached value if we have one
    if table[sum, largestNumber] != 0:
        return table[sum, largestNumber]
    # Either skip the current largest part, or use it once more
    table[sum, largestNumber] = partition(sum, largestNumber - 1) + partition(sum - largestNumber, largestNumber)
    return table[sum, largestNumber]

def main():
    result = 0
    sum = 0
    largestNumber = 0
    # Try n = 1, 2, 3, ... until p(n) is divisible by one million
    while result == 0 or result % 1000000 != 0:
        sum += 1
        largestNumber += 1
        result = int(partition(sum, largestNumber))
    print("n = {}, result = {}".format(sum, result))
    return 0

if __name__ == '__main__':
    main()
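For anyone who wants to reproduce the n < 20 check, here is a minimal sketch of the same recurrence using plain Python ints with functools.lru_cache instead of the numpy table (partition_exact is an illustrative name, not part of my original code):

from functools import lru_cache

@lru_cache(maxsize=None)
def partition_exact(total, largest):
    # Same base cases as partition() above
    if largest == 0:
        return 0
    if total == 0:
        return 1
    if total < 0:
        return 0
    # Either skip the current largest part, or use it once more
    return partition_exact(total, largest - 1) + partition_exact(total - largest, largest)

# Sanity check against the example from the problem statement
print(partition_exact(5, 5))                            # expected: p(5) = 7
print([partition_exact(n, n) for n in range(1, 20)])    # p(1) .. p(19)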