I have this code, which converts a string to an int:

#include <stdio.h>
#include <stdlib.h>

unsigned int formatInt(char *ptr) {
    int res;
    /* sscanf returns the number of items converted; bail out if it isn't 1 */
    if (sscanf(ptr, "%i", &res) != 1) exit(EXIT_FAILURE);
    return res;
}
I fed it a char * pointing to the first character of "00000000041". The conversion returns 33 (an implicit octal-to-decimal conversion). "00000000041" is a string (a char[12]), but it is the size of a file in octal. How did the compiler know it was octal? 00000000041 could just as well be decimal (41).
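For reference, a minimal call that reproduces this (the main function is just an illustration of how I invoke it):

int main(void) {
    printf("%u\n", formatInt("00000000041")); /* prints 33, not 41 */
    return 0;
}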
CodePudding user response:
Recognizing the string as octal is a function of the %i format specifier to scanf. From the man page:
i Matches an optionally signed integer; the next pointer must be a pointer to int. The integer is read in base 16 if it begins with 0x or 0X, in base 8 if it begins with 0, and in base 10 otherwise. Only characters that correspond to the base are used.
So because the string begins with a 0 and %i was used, it is interpreted as an octal string.
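If you need to control the base yourself, %d always reads base 10, and strtol takes an explicit base argument. A minimal sketch contrasting the options (the string literal is just an example value):

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const char *s = "00000000041";
    int auto_base, decimal;

    sscanf(s, "%i", &auto_base); /* leading 0 => read as octal: 33 */
    sscanf(s, "%d", &decimal);   /* %d is always base 10: 41 */

    /* strtol lets you name the base explicitly */
    long as_octal   = strtol(s, NULL, 8);  /* 33 */
    long as_decimal = strtol(s, NULL, 10); /* 41 */

    printf("%d %d %ld %ld\n", auto_base, decimal, as_octal, as_decimal);
    return 0;
}

Since your input is documented to be an octal file size, strtol(ptr, NULL, 8) makes that assumption explicit rather than relying on %i's auto-detection of the leading zero.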