My understanding:
Magic number: a number is called a magic number if repeatedly summing its digits, until only a single digit remains, gives 1.
Examples: 1 [sum: 1], 10 [sum: 1 + 0 = 1], 91 [sum: pass 1: 9 + 1 = 10, pass 2: 1 + 0 = 1], 100 [sum: 1 + 0 + 0 = 1], and so on.
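For reference, here is a minimal sketch of the repeated digit sum I mean (the name digital_root is mine, just to illustrate the definition; it is not my actual code, which follows below):

def digital_root(num):
    # Keep replacing num with the sum of its digits
    # until only a single digit remains.
    while num > 9:
        total = 0
        while num > 0:
            total += num % 10  # take the last digit
            num //= 10         # drop the last digit
        num = total
    return num

# A number is magic when the final single digit is 1:
# digital_root(91) == 1, digital_root(100) == 1, digital_root(55) == 1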
Problem: my code works fine for input 91 (sum of digits of 91: pass 1: 9 + 1 = 10, pass 2: 1 + 0 = 1) and shows the correct output. But it is not working for any other input, such as 100 or 55.
Code:
def Single_Digit_Summer(num):
    sum = 0
    while num > 0:
        sum += num % 10  # add the last digit
        num = num // 10  # drop the last digit
        return Single_Digit_Summer(sum) if sum > 9 else sum

num15 = int(input("Enter number to check: "))
sum2 = Single_Digit_Summer(num15)
print(num15, "is a magic number.") if sum2 == 1 else print(num15, "is not a magic number.")
Output:
Case - 1:
Terminal: Enter number to check:
Terminal input: 91
Terminal output: 91 is a magic number.
Actual var 'sum' value: 1
Expected var 'sum' value: 1
Expected output: 91 is a magic number.
Decision: OK
Case - 2:
Terminal: Enter number to check:
Terminal input: 100
Terminal output: 100 is not a magic number.
Actual var 'sum' value: 0
Expected var 'sum' value: 1
Expected output: 100 is a magic number.
Decision: Wrong Output
Case - 3:
Terminal: Enter number to check:
Terminal input: 55
Terminal output: 55 is not a magic number.
Actual var 'sum' value: 5
Expected var 'sum' value: 1
Expected output: 55 is a magic number.
Decision: Wrong Output
Case - 4:
Terminal: Enter number to check:
Terminal input: 1009
Terminal output: 1009 is not a magic number.
Actual var 'sum' value: 9
Expected var 'sum' value: 1
Expected output: 1009 is a magic number.
Decision: Wrong Output
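For completeness, a small harness along these lines (hypothetical, using Single_Digit_Summer as defined above) reproduces all four cases at once; the expected value is 1 for every input, per the definition:

for n in [91, 100, 55, 1009]:
    actual = Single_Digit_Summer(n)
    print(n, "-> actual:", actual, "expected: 1",
          "OK" if actual == 1 else "Wrong Output")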
Requirement: I can't figure out why my code is not working. Please help me debug it.