Why is decimal precision culture dependent?

While writing a test that asserts on a formatted string, I noticed that the number of decimal places produced by the percent format specifier ("P") differs between cultures.

In the example below, en-US uses two decimal digits, while the other sampled cultures use three.

What is the heuristic behind this? Is there a way, other than rounding, of normalizing the different cultures to the same precision?

Console.WriteLine(0.19999d.ToString("P", new CultureInfo("en-US"))); // "20.00%"
Console.WriteLine(0.19999d.ToString("P", new CultureInfo("en-GB"))); // "19.999%"
Console.WriteLine(0.19999d.ToString("P", new CultureInfo("en"))); // "19.999%"
Console.WriteLine(0.19999d.ToString("P", new CultureInfo("nb-NO"))); // "19,999 %"
Console.WriteLine(0.19999d.ToString("P", new CultureInfo("nl-NL"))); // "19,999%"

Runtime: 6.0.9
Platform: Windows

CodePudding user response:

They are different because that's what those cultures have determined to be appropriate: when "P" is used without an explicit precision, the number of decimal places comes from the culture's NumberFormatInfo.PercentDecimalDigits value, and those cultures simply define different defaults. If you want a specific number of decimal places rather than accepting the culture-specific default, include it in your format specifier, e.g. "P2".
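For example, a minimal sketch (assuming using System.Globalization is in scope, as in the question's snippets) where an explicit precision in the format string overrides each culture's default, so every culture prints two decimals:

Console.WriteLine(0.19999d.ToString("P2", new CultureInfo("en-US"))); // "20.00%"
Console.WriteLine(0.19999d.ToString("P2", new CultureInfo("nb-NO"))); // "20,00 %"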

Alternatively, you can create your own CultureInfo or NumberFormatInfo that sets the relevant properties the way you want.
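A minimal sketch of that approach: clone an existing culture's NumberFormatInfo and override its PercentDecimalDigits, so plain "P" formatting uses your chosen precision while keeping the rest of the culture's conventions (separator, symbol placement):

// Clone so the original read-only culture data is not modified.
var nfi = (NumberFormatInfo)new CultureInfo("nb-NO").NumberFormat.Clone();
nfi.PercentDecimalDigits = 2;
Console.WriteLine(0.19999d.ToString("P", nfi)); // "20,00 %"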
