I have to find the value of 1 + x/1! + x^2/2! + ...
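In other words, I'm summing the exponential (Taylor) series:

$$\sum_{n=0}^{\infty} \frac{x^n}{n!} = 1 + \frac{x}{1!} + \frac{x^2}{2!} + \cdots = e^x$$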
Here is my code:
#include <stdio.h>
#include <math.h>

void main()
{
    long int a, i = 0;
    double p, e, x, sum = 1, t = 1;

    printf("Enter desired number of decimal places\n");
    scanf("%d", &a);
    printf("Enter x\n");
    scanf("%d", &x);
    e = pow(10, -a);
    do
    {
        t = x * t / (double) (i + 1);
        sum = sum + t;
        p = fabs(t);
        i++;
        printf("Sum is %ld\n", sum);
    } while ((p > e) && (i < 10000));
}
My logic for the program is as follows:
The user enters x and the desired accuracy a, and my error tolerance is 10^(-a). i is my counter, sum holds the running sum of the series, and t holds the current term.
I start with t = 1 and use the recurrence t = (x * t) / (double)(i + 1). (Since i is an integer and t is a double, I cast to double here.)
For example, if I choose x = 5, then t starts at 1. On the first pass I get t = 5*1/1 = 5. sum is initialized to 1, so sum becomes 1 + 5 = 6, and i becomes 1. On the next pass t becomes 5*5/2 = 12.5, so sum should become 6 + 12.5 = 18.5, and so on through the do {} part.
Now, for the while condition: p is the absolute value of t, which is exactly the difference between the sum of (n+1) terms and the sum of n terms. So once p drops below e, I should have my desired sum, right? To avoid an infinite loop in case the series diverges, I only let the do {} part run while i < 10000.
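To spell out what I think each iteration should be doing, here is the same loop as a standalone sketch, with x and the tolerance hard-coded (x = 5 and a = 6 are just example values, the variable names mirror my program, and I print only once at the end):

#include <stdio.h>
#include <math.h>

int main(void)
{
    double x = 5.0;      /* example input */
    double e = 1e-6;     /* tolerance 10^(-a) with a = 6 */
    double t = 1.0;      /* current term, starts at x^0/0! = 1 */
    double sum = 1.0;    /* running sum, starts with that first term */
    long int i = 0;      /* counts how many terms have been added */

    do
    {
        t = x * t / (double) (i + 1);  /* turns x^i/i! into x^(i+1)/(i+1)! */
        sum = sum + t;
        i++;
    } while (fabs(t) > e && i < 10000);

    printf("Sum is %f\n", sum);
    return 0;
}

Tracing the first two passes of this sketch by hand gives the 6 and then 18.5 I described above, and the loop stops once the newest term is smaller than the tolerance.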
The problem is that every time I run this program, the sum is printed as 0. How is that even possible when I initialized sum to 1?
Please help me figure out where I'm going wrong. I'm new to programming.