Why is a declaration not a statement in C?


The following example is an illegal C program, which is confusing and shows that a declaration is not a statement in the C language.

int main() {
  if (1) int x;   /* error: the body of an if must be a statement, and a declaration is not one */
}

I've read the C specification (N2176) and I know that the C language distinguishes declarations from statements in its syntax specification. I told my teacher, who teaches compilers, and he does not seem to believe it; I cannot convince him unless I show him the specification.

So, I am also really confused. Why is C designed like this? Why is a declaration not a statement in C? How can I convince someone of the reason for this design?
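
For reference, the relevant grammar productions look roughly like this (paraphrased from N2176, 6.8, 6.8.2 and 6.8.4): the body of an if must be a statement, and a declaration is only admitted as a block-item inside a compound statement.

statement:
    labeled-statement
    compound-statement
    expression-statement
    selection-statement
    iteration-statement
    jump-statement

selection-statement:
    if ( expression ) statement
    if ( expression ) statement else statement
    ...

compound-statement:
    { block-item-list(opt) }

block-item:
    declaration
    statement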

CodePudding user response:

If it were valid, what would you want this program to do?


#include <stdio.h>

int main(int argc, char **argv)
{
    if (argc > 1) int x = 42;   /* not valid C, but suppose it were */

    printf("%d\n", x);          /* what should x be here? */
    return 0;
}
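
For comparison, here is a minimal sketch of how that intent has to be expressed in valid C: the declaration goes into a scope that also contains every use of the variable.

#include <stdio.h>

int main(int argc, char **argv)
{
    int x = 0;          /* declared in the enclosing scope... */

    if (argc > 1)
        x = 42;         /* ...so it is still in scope here */

    printf("%d\n", x);  /* ...and here */
    return 0;
}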

CodePudding user response:

Because a declaration doesn't do anything: it is purely informative for the compiler.

Consider this code:

#include <stdio.h>

int main(void)
{
    int x;
    printf("Hello World!\n");
    return 0;
}

What do you think int x; will do? You are declaring that x is of type int, but you never use x anywhere in the rest of the code. The compiler will thus not even reserve any memory on the stack for it. The code the compiler generates is exactly identical to the code it generates when you compile:

#include <stdio.h>

int main(void)
{
    printf("Hello World!\n");
    return 0;
}

There is simply nothing a compiler must do just because you let it know the type of a variable. The variable doesn't have to exist anywhere at all unless it is actually used by a statement.

C is not an interpreted language, where every piece of code instructs the interpreter to do something. C is a compiled language, which means you tell the compiler to generate CPU code that performs the actions you would like your program to perform. So there is no one-to-one relationship between the code you write and the CPU code the compiler generates.

You may write

int x = a / 8;

but the CPU code that the compiler generates may be equivalent to

int x = a >> 3;

For unsigned (or known non-negative) values that is exactly the same thing, and since shifting is faster than division (and you can bet it is), the compiler will not generate a division just because you told it to; for signed values it typically emits a shift plus a small correction, but still no division. What you told the compiler is "I want x to be one eighth of a", and the compiler will be like "okay, I'll generate code that makes this happen", but how the compiler makes it happen is up to the compiler.
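
A small sketch you can compile to see why the signedness caveat matters (the exact result of shifting a negative value is implementation-defined, but on typical two's-complement machines it rounds differently than division does):

#include <stdio.h>

int main(void)
{
    unsigned int a = 100;
    int b = -9;

    /* For unsigned (or non-negative) values, division by 8 and a
       right shift by 3 give the same result: */
    printf("%u %u\n", a / 8, a >> 3);   /* prints: 12 12 */

    /* For negative signed values they differ (division truncates
       toward zero, while an arithmetic shift rounds toward negative
       infinity), which is why the compiler emits a shift plus a small
       correction rather than a bare shift: */
    printf("%d %d\n", b / 8, b >> 3);   /* typically prints: -1 -2 */

    return 0;
}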

Thus the compiler only needs to translate statements that actually have an effect. A declaration on its own has no effect; it just lets the compiler know the type of a variable or function.
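
Coming back to the snippet from the question: wrapping the declaration in braces makes the body of the if a compound statement, which is a statement, so this compiles; and the declaration now matters precisely because a statement in the same block uses the variable.

#include <stdio.h>

int main(void)
{
    if (1) {            /* a compound statement is itself a statement... */
        int x = 5;      /* ...and a declaration may appear inside it as a block-item */
        printf("%d\n", x);
    }
    return 0;
}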
