#include <stdio.h>
int main()
{
    int n, i;
    double sum = 0.0;
    scanf("%d", &n);
    for (i = 1; i <= n; i++) {
        sum = sum + 1 / i;
    }
    printf("%.6f", sum);
    return 0;
}
The program above always prints 1.000000, no matter what n is entered.
Where did it go wrong?
CodePudding user response:
i is an int, so 1 / i is computed with integer division, and the result is 0 whenever i is greater than 1. That is why only the first term (1/1 = 1) contributes to the sum. Change the line to:
sum = sum + 1.0 / i; // 1.0 is a double, so the division is done in floating point