I'm trying to round down Decimal values using NSDecimalRound(...). For the most part it behaves according to my expectations, but now and again it returns a value that takes me by surprise. In the example below the first calculation does what I would expect, that is, with a scale of 4, it rounds 2341.2143 down to 2341.2143 (i.e. the same value). The second calculation is identical except that the value being rounded is now 2341.2142 (note the last digit is 2, not 3); this time the returned value is 2341.2141, whereas I would expect 2341.2142. Similar behaviour occurs with 2341.2146, which becomes 2341.2145 when rounded down to 4 d.p. Can anyone explain why this happens?
// Working as expected... ////////////
var unrounded: Decimal = 2341.2143
var rounded = Decimal()
NSDecimalRound(&rounded, &unrounded, 4, .down)
print(rounded) // 2341.2143
// A bit unexpected (to me at least) /////////
var unrounded: Decimal = 2341.2142
var rounded = Decimal()
NSDecimalRound(&rounded, &unrounded, 4, .down)
print(rounded) // 2341.2141
CodePudding user response:
The problem comes from the literal, which is a Double and has to be converted to a Decimal. So the error comes from the conversion:
var unrounded: Decimal = 2341.2143
//  ^-- Decimal var      ^-- Double literal
As you know, floating-point encoding is imprecise. You can check with an IEEE 754 converter that:
2341.2143 is encoded as 2341.21435546875, which rounded down gives 2341.2143
2341.2142 is encoded as 2341.214111328125, which rounded down gives 2341.2141
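As a quick check, you can compare the Decimal produced by the float literal with one parsed via Decimal(string:), which never goes through Double (a small sketch; the exact digits stored depend on the Double-to-Decimal conversion, but the value is not exactly 2341.2142):
import Foundation
// The literal goes through Double, so the stored value ends up slightly below
// 2341.2142; the string initializer keeps the exact decimal digits.
let fromLiteral: Decimal = 2341.2142
let fromString = Decimal(string: "2341.2142")!
print(fromLiteral == fromString) // false: the literal lost precision
print(fromLiteral < fromString)  // true: the stored value is a bit below 2341.2142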
You need to initialize the Decimal without going through a Double to benefit from its superior precision. Unfortunately, this is not intuitive. You can, for example, use the following, which behaves as expected:
var unrounded3: Decimal = Decimal(sign:.plus, exponent:-4, significand:23412142)
var rounded3 = Decimal()
NSDecimalRound(&rounded3, &unrounded3, 4, .down)
print(rounded3) // 2341.2142
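If constructing the value from an exponent and significand feels awkward, a string-based initializer, Decimal(string:), also avoids the Double round-trip. A minimal sketch, assuming the value is available as text:
import Foundation
// Decimal(string:) keeps the exact decimal digits, so rounding down
// at scale 4 leaves the value unchanged.
var unrounded4 = Decimal(string: "2341.2142")!
var rounded4 = Decimal()
NSDecimalRound(&rounded4, &unrounded4, 4, .down)
print(rounded4) // 2341.2142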