Casting double to int works with constants only

Time:09-21

I found a strange (to me) behavior of casting to int in C. Apologies if this is a basic question, but I'm unable to find an answer to why the following code produces an unexpected result.

#include <stdio.h>
int main(void)
{
    printf("1000 * 0.1 = %d\n", (1000 * 0.1));
    printf("1000 * (10/100) = %d\n", (1000 * (10/100)));
    printf("(int)1000 * 0.1 = %d\n", (int)(1000 * 0.1));
    printf("(int)1000 * (10/100) = %d\n", (int)(1000 * (10/100)));

    return 0;
}

Result with both -O0 and -O3 is the same:

1000 * 0.1 = -957043896
1000 * (10/100) = 0
(int)1000 * 0.1 = 100
(int)1000 * (10/100) = 0

I expect a nonsensical result for the first two (I don't know why, but I expect passing a double to an int argument shouldn't work). However, the difference between 3 and 4 is puzzling to me. I expected (10/100) to be calculated at compile time and yield the same result as 3.

Can someone explain to me why such result happens, and what is the proper/safe way to do integer-based divisions here?

CodePudding user response:

printf("1000 * 0.1 = %d\n", (1000 * 0.1));
//                           int    double

int * double gives a double. You're trying to print a double with "%d", which is undefined behaviour (usually written UB). Try printf("1000 * 0.1 = %f\n", 1000 * 0.1);

printf("1000 * (10/100) = %d\n", (1000 * (10/100)));
//                                int    int int

int / int does integer division, with no decimals. 10/100 yields 0 (with a remainder of 10).

printf("(int)1000 * 0.1 = %d\n", (int)(1000 * 0.1));
//                                     int    double

the double 100.0 is converted to int and printed as expected. Note that, because 0.1 cannot be represented exactly in binary floating point, 1000 * 0.1 could in principle evaluate to something slightly below 100 (e.g. 99.999...), which would convert to 99.

printf("(int)1000 * (10/100) = %d\n", (int)(1000 * (10/100)));
//                                          int    int int

10/100 is integer division, which yields 0 here, the same as in the second statement above.

CodePudding user response:

There is a difference for example between these two calls of printf

printf("1000 * 0.1 = %d\n", (1000 * 0.1));
printf("1000 * (10/100) = %d\n", (1000 * (10/100)));

In the first call, the expression 1000 * 0.1 has the type double, but you are using the incorrect conversion specifier %d, which is designed for arguments of the type int.

In the second call, the expression 1000 * (10/100) indeed has the type int. Note that casting the expression to int, as in (int)(1000 * (10/100)), is redundant and serves no purpose. If you want the division to produce a fractional value, you could instead write, for example, (int)(1000 * (10.0/100)). So this call is correct as far as the format specifier goes; however, the value of the printed expression is 0 due to the integer arithmetic.
