What is the difference (if any) between "defined" and "well-defined"?


Recently I've seen this comment w.r.t. behavior (emphasis added):

Generally, it's not undefined, but not entirely well-defined either.

It confused me a bit, because if the behavior is not undefined, then it is defined. Hence, what is the difference (if any) between defined behavior and well-defined behavior?

Note that the standard does not define the term well-defined behavior. However, it does use the term well-defined semantics. So the question can be restated: what is the difference (if any) between defined semantics and well-defined semantics?

In general: what is the difference (if any) between defined and well-defined?

Reason for the question: a better understanding of the standard.

CodePudding user response:

Well-defined is what we usually call code that contains no undefined, unspecified, or implementation-defined behavior. The definition of a strictly conforming program from chapter 4 might be helpful:

A strictly conforming program shall use only those features of the language and library specified in this International Standard. It shall not produce output dependent on any unspecified, undefined, or implementation-defined behavior, and shall not exceed any minimum implementation limit.

Any snippet of C code fulfilling the above could informally be said to be well-defined, meaning defined by the ISO C language standard and portable, as opposed to, for example, implementation-defined behavior, which is documented, compiler-specific behavior.
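To make the categories concrete, here is a minimal C sketch; the comments about typical results are assumptions about common platforms, not guarantees:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Well-defined: the standard fully specifies the result. */
    printf("%d\n", 2 + 2);            /* always prints 4 */

    /* Implementation-defined: several results are allowed, and the
       compiler must document which one it chooses.                 */
    printf("%zu\n", sizeof(int));     /* often 4, but not guaranteed */

    /* Unspecified: several results are allowed, and the compiler need
       not document or even be consistent about its choice, e.g. the
       order in which the operands of + are evaluated.                */

    /* Undefined: the standard imposes no requirements at all.
       Uncommenting the next line would make this true of the whole
       program (signed integer overflow).                           */
    /* printf("%d\n", INT_MAX + 1); */

    return 0;
}

Note that only the first printf keeps the program strictly conforming in the chapter 4 sense: the sizeof line already makes the output depend on implementation-defined behavior.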

CodePudding user response:

Coming from a mathematical perspective, a well-defined expression is one that produces a consistent, unique output for a given input. To take the examples from Wikipedia (because they're quite clear):

  • A function being undefined somewhere does not make it not well-defined. For instance, f(x) = 1/x is undefined at 0, but is still a well-defined function. Simply put, 0 isn't in the domain of f.
  • For a real-numbered input, if for example f(0.5) != f(1/2), then f is not well-defined (the same input, written in two different ways, producing different outputs). A C analogue of this case follows the list.
  • For sets B and C, where A is the union of B and C, let the function f be a mapping A -> {0,1} such that f(b) = 0 and f(c) = 1, where b is an element of B and c is an element of C. Then f is well-defined iff B and C have no common elements. That is to say, if B = C = {2}, then f(2) = 0 and f(2) = 1, which is ambiguous.
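The second bullet has a close C analogue: a program can take the same input (here, none) and still be permitted to produce different outputs, because the evaluation order of the operands of + is unspecified. A minimal sketch, assuming a hosted implementation:

#include <stdio.h>

static int g(void) { printf("g "); return 1; }
static int h(void) { printf("h "); return 2; }

int main(void)
{
    /* The order in which g() and h() are called is unspecified, so
       both "g h 3" and "h g 3" are conforming outputs. Nothing here
       is undefined, yet the output is not uniquely determined, i.e.
       not "well-defined" in the mathematical sense above.           */
    int sum = g() + h();
    printf("%d\n", sum);
    return 0;
}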

Taking this definition to programming, I would have to say that they meant one of two things:

  • The expression in question is either not consistent (mathematically) or does not provide a consistent output (colloquially).
  • Informally, the expression must perform some operation, but is not restricted in the means by which it may accomplish it. This is usually the case for implementation-specific behavior (e.g. many operations that can be defined differently depending on the compiler used); see the sketch after this list.
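As one illustration of the second point, whether plain char is signed is left to the implementation: every conforming compiler must document its choice, but portable code cannot assume either one. A small sketch:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* CHAR_MIN is 0 if plain char is unsigned and negative if it is
       signed; which one you get is implementation-defined, i.e. a
       documented, compiler-specific choice.                         */
    if (CHAR_MIN < 0)
        printf("plain char is signed here\n");
    else
        printf("plain char is unsigned here\n");
    return 0;
}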