Why does double a = 1/2 go wrong?


In C++ I define double a = 1/2, but when I print it, a is 0. What is the principle behind this?

CodePudding user response:

Try double a = 1/2.0 instead.

CodePudding user response:

When two int values are divided, the result is also an int, so 0.5 is truncated to 0 before it is assigned to a. Write double a = 1.0/2 instead.
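
A minimal sketch to illustrate the difference (the variable names are only for this example):

#include <iostream>

int main() {
    double a = 1 / 2;     // both operands are int, so 1/2 is integer division -> 0
    double b = 1.0 / 2;   // 1.0 is a double, so the division is done in floating point -> 0.5
    double c = 1 / 2.0;   // one double operand is enough -> 0.5

    std::cout << a << " " << b << " " << c << "\n";  // prints: 0 0.5 0.5
    return 0;
}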

CodePudding user response:

It is integer division, so the fractional part is discarded.

CodePudding user response:

Use double a = 1.0/2 or double a = 1/2.0. Because 1/2 is division between two integers, the result is 0, and assigning that 0 to the double a still gives 0.
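
The same truncation happens when both operands are int variables. A short sketch, assuming you cannot change the literals themselves: casting one operand to double before dividing avoids the truncation.

#include <iostream>

int main() {
    int x = 1, y = 2;
    double wrong = x / y;                       // integer division: 0
    double right = static_cast<double>(x) / y;  // one operand promoted to double: 0.5
    std::cout << wrong << " " << right << "\n"; // prints: 0 0.5
    return 0;
}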