Hi everyone, this is C Primer Plus, Chapter 6, exercise 12:
**Consider these two infinite series:
1.0 + 1.0/2.0 + 1.0/3.0 + 1.0/4.0 + ...
1.0 - 1.0/2.0 + 1.0/3.0 - 1.0/4.0 + ...
Write a program that evaluates running totals of these two series up to some limit of number of terms. Hint: -1 times itself an odd number of times is -1, and -1 times itself an even number of times is 1. Have the user enter the limit interactively; let a zero or negative value terminate input. Look at the running totals after 100 terms, 1000 terms, 10,000 terms. Does either series appear to be converging to some value?**
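My reading of the hint is that the sign of term n should be +1 when n is odd and -1 when n is even, and one simple way to get that is to flip a sign variable every iteration. Here is a minimal sketch of just that idea (the names `limit`, `sign`, `sum`, and `n` are my own, not from the book):

```c
#include <stdio.h>

int main(void)
{
    int limit = 100;        /* e.g. look at the first 100 terms */
    int n;
    float sum = 0.0f;
    float sign = 1.0f;      /* +1 for the 1st term, -1 for the 2nd, ... */

    for (n = 1; n <= limit; n++)
    {
        sum += sign / (float)n;  /* adds +1/n or -1/n */
        sign = -sign;            /* flip the sign for the next term */
    }
    printf("Alternating sum after %d terms: %f\n", limit, sum);
    return 0;
}
```

My full attempt below is supposed to do the same thing, but it decides the sign by testing whether the denominator is odd or even: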
```c
#include <stdio.h>

int main(void)
{
    int times, a, b, d;
    float sum1, sum2, c;

    printf("Enter the times: ");
    scanf("%d", &times);
    while (times > 0)
    {
        sum1 = 0;
        sum2 = 0;
        /* first series: 1 + 1/2 + 1/3 + ... + 1/times */
        for (a = times; a >= 1; a--)
            sum1 += 1.0 / (float)a;
        printf("The sum1 is %f\n", sum1);
        /* second series: c is meant to be +1 for odd b and -1 for even b */
        for (b = times; b >= 1; b--)
        {
            c = -1.0;
            while ((d = b) % 2 == 1)
            {
                c = 1.0;
                d++;
            }
            sum2 += (c / (float)b);
        }
        printf("The sum2 is %f\n", sum2);
        printf("Enter the times again: ");
        scanf("%d", &times);
    }
    return 0;
}
```
What's wrong with my code?