Why double and %f don't want to print 10 decimals?


I am learning the C programming language and am figuring out format specifiers, but it seems as if double and %f are not working correctly.

Here is my code

#include <stdio.h>
int main(void)
{ 
    double a = 15.1234567899876;
    printf(".10f", a);
}

In my textbook it's stated that in "%13.10f", 13 stands for the total number of digits we want printed (including the dot) and 10 is the number of decimals. So I expected to get 15.1234567899 but didn't.

After running it, I get 15.1234567900. It's not just that there aren't enough decimals; the decimals themselves are not printed correctly. Variable a has an 8 after the 7 and before the 9, but the printed number does not.

Can someone please tell me where I am wrong?

Thank you. Lp

CodePudding user response:

printf is supposed to round the result to the number of digits you asked for.

  you asked: 15.1234567899876
    you got: 15.1234567900
digit count:    1234567890

So printf is behaving correctly.
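
For what it's worth, here is a minimal, self-contained version of your program with the format string written out as %13.10f; on a typical implementation it prints the rounded value shown above (the comments, the return statement, and the trailing newline are my additions):

#include <stdio.h>

int main(void)
{
    double a = 15.1234567899876;

    /* Width 13, precision 10: a field at least 13 characters wide,
       with exactly 10 digits after the decimal point, rounded. */
    printf("%13.10f\n", a);   /* prints 15.1234567900 */

    return 0;
}

The 11th decimal of the stored value is 8, so the 10th decimal rounds up from ...99 to ...00, which is exactly the output you saw.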

You should beware, though, that both types float and double have finite precision, and that precision is a fixed number of binary bits, not decimal digits. So after about 7 significant digits for a float, and about 15 for a double, you'll start seeing results that can seem quite strange if you don't realize what's going on. You can see this if you start printing more digits:

printf(".15f\n", a);

  you asked: 15.1234567899876
    you got: 15.123456789987600

So that's okay. But:

printf("#.20f\n", a);

  you asked: 15.1234567899876
    you got: 15.12345678998759979095

Here we see that, at the 15th digit, the number actually stored internally begins to differ slightly from the number you asked for. You can read more about this at Is floating point math broken?
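
If you want to poke at this on your own machine, a minimal sketch (assuming an ordinary IEEE 754 double, which is what virtually all current platforms use) is to print DBL_DIG from <float.h> together with the %a hexadecimal form, which shows the binary value that is actually stored:

#include <stdio.h>
#include <float.h>

int main(void)
{
    double a = 15.1234567899876;

    /* DBL_DIG is the number of decimal digits that can round-trip
       through a double without change; it is 15 on common platforms. */
    printf("DBL_DIG = %d\n", DBL_DIG);

    /* %a prints the exact binary value that is actually stored,
       written in hexadecimal floating-point notation. */
    printf("%a\n", a);

    /* Asking for 20 decimals exposes the representation error. */
    printf("%.20f\n", a);

    return 0;
}

On a typical system this reports DBL_DIG = 15, which is why the first 15 or so significant digits above match and the later ones drift.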
