Why does scanf first return 1 and then 0 when it gets decimal input - but expects integer?


Why does the following code work at first when a decimal number is entered, but on the second iteration scanf immediately returns 0 and the loop breaks?

#include <stdio.h>


int main(){
    int num = 1;

    while(num != 0){
        printf("input integer: ");

        if(scanf("%d", &num) == 0)
            break;
        
        printf("out: %d\n", num);      
    }
    return 0;
}

If I enter, for example, 5.55, it prints 5 (scanf returns 1 and printf executes), but on the next iteration scanf returns 0 and the loop breaks. Why does scanf return 1 in the first place?

CodePudding user response:

Depending on the conversion specifier in your format string, scanf consumes as much input from the input stream as matches that conversion.

For %s it skips leading whitespace and then reads until the next whitespace character. For %d it skips leading whitespace and then reads an optional sign followed by digits.

If you use %d and enter "5.55" on stdin, scanf consumes only the leading '5'; the remaining ".55" (and the newline) stays in the buffer. On your first call, scanf is therefore able to convert input for the first parameter, so it returns 1 and assigns the value 5 to num.

The data in the input buffer does not go away between calls to scanf.
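
A minimal sketch (separate from your program) that makes this visible: after scanf("%d") reads the 5 from an input of 5.55, the very next character still waiting in stdin is the '.':

#include <stdio.h>

int main(void) {
    int num;

    /* With input "5.55", %d consumes only the leading "5". */
    if (scanf("%d", &num) == 1)
        printf("num = %d\n", num);

    /* The '.' is still sitting in stdin, waiting for the next read. */
    int c = getchar();
    if (c != EOF)
        printf("next char in buffer: '%c'\n", c);

    return 0;
}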

In your second call you also use %d, but now scanf sees the '.', which does not match the required format. As a result, no input is consumed, no value is converted, and the return value is 0.

It will stay that way on every following call until you consume the offending input in another way.
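
One way to do that (a sketch based on your loop, not the only option) is to check scanf's return value and, on a mismatch, read and discard the rest of the line before prompting again:

#include <stdio.h>

int main(void) {
    int num = 1;

    while (num != 0) {
        printf("input integer: ");

        int rc = scanf("%d", &num);
        if (rc != 1) {
            if (rc == EOF)          /* no more input at all */
                break;

            /* Mismatch: throw away the rest of the offending line,
               then go back and prompt again. */
            int c;
            while ((c = getchar()) != '\n' && c != EOF)
                ;
            continue;
        }

        printf("out: %d\n", num);
    }
    return 0;
}

With this change, entering 5.55 prints "out: 5", the leftover ".55" is discarded, and the next prompt reads fresh input instead of failing forever.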
