I have been trying to solve this problem:
The following iterative sequence is defined for the set of positive integers:

n → n/2 (n is even)
n → 3n + 1 (n is odd)

Using the rule above and starting with 13, we generate the following sequence:

13 → 40 → 20 → 10 → 5 → 16 → 8 → 4 → 2 → 1

It can be seen that this sequence (starting at 13 and finishing at 1) contains 10 terms. Although it has not been proved yet (Collatz Problem), it is thought that all starting numbers finish at 1.

Which starting number, under one million, produces the longest chain?

NOTE: Once the chain starts the terms are allowed to go above one million.
I implemented the code below, but it doesn't give the correct answer: it reports 910107 as the starting number that produces the longest chain, but the answer should be 837799. What's wrong with it?
#include <stdio.h>

int main(void)
{
    int count = 1;          /* length of the current chain */
    int last_count = 0;     /* longest chain length found so far */
    int num = 13;           /* starting point, as in the example */
    int temp;
    int Largest_Num = 0;    /* starting number of the longest chain */

    for (int i = num; i < 1000000; i++)
    {
        temp = i;
        while (temp > 1)
        {
            if (temp % 2 == 0)
            {
                temp /= 2;
            }
            else
            {
                temp = (3 * temp) + 1;
            }
            count++;
        }
        if (last_count < count)
        {
            last_count = count;
            Largest_Num = i;
        }
        count = 1;
    }
    printf("%d\n", last_count);
    printf("%d\n", Largest_Num);
    return 0;
}