Does type deduction fail in this example?


Consider the following example:

template<typename T>
void f(T&) {}

int main(void){

    const int i = 0;
    const int& r = i;
    f(r); 
}   

Let's name the template parameter type as P and the template argument type as A. Then, P and A will be:

P = T&, A = const int;

Since P is a reference type, [temp.deduct.call]/2 is not applied, but I think [temp.deduct.call]/3 is, which says:

If P is a cv-qualified type, the top-level cv-qualifiers of P's type are ignored for type deduction. If P is a reference type, the type referred to by P is used for type deduction.

After applying the above quote, P and A will be:

P = T, A = const int; // T = const int;
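As a quick compile-time check of that result (my own addition, not part of the question; it assumes C++17 for std::is_same_v):

#include <type_traits>

template<typename T>
void f(T&) {
    static_assert(std::is_same_v<T, const int>, "T is deduced as const int");
}

int main(void) {
    const int i = 0;
    const int& r = i;
    f(r); // instantiates f<const int>(const int&)
}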

I got stuck when I applied the same concept to this example:

template<typename T>
void g(const T&) {}

int main(void)
{
    int i = 0;
    g(i); 
}

The types of the template parameter P and its corresponding template argument A are:

P = const T&, A = int;

After applying the above quote, P and A will be:

P = const T, A = int; // Does type deduction fail here?

To my knowledge, T cannot be deduced because the argument type A requires a qualification conversion to match the parameter type P. Am I correct in saying that? If not, what is the main reason that deducing const T from int fails? If possible, please cite a rule from the C++ standard. This is my first question.

But when I tested that code with g++ 11.2.0, the program compiles fine, and the instantiated function is g<int>(const int&).

So it seems that T has been deduced as int; why? This is my second question.

CodePudding user response:

To my knowledge, T cannot be deduced because the argument type A requires a qualification conversion to match the parameter type P. Am I correct in saying that? This is my first question.

No, your assumption/understanding that a conversion from int to const int requires a qualification conversion is incorrect. An example of a qualification conversion would be something like from int* to const int*.
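To make the distinction concrete, here is a minimal sketch (my own addition, not from the answer) contrasting a genuine qualification conversion with plain reference binding:

int main() {
    int x = 0;

    // Qualification conversion: int* -> const int* (const is added at the pointed-to level).
    int* p = &x;
    const int* cp = p;

    // No qualification conversion here: an int lvalue simply binds to a reference
    // to a more cv-qualified type (const int&).
    const int& cr = x;

    (void)cp;
    (void)cr;
}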


So it seems that T has been deduced as int; why? This is my second question.

This can be understood from [temp.deduct.call]/4, which states:

In general, the deduction process attempts to find template argument values that will make the deduced A identical to A (after the type A is transformed as described above). However, there are three cases that allow a difference:

4.1) If the original P is a reference type, the deduced A (i.e., the type referred to by the reference) can be more cv-qualified than the transformed A.

4.2) The transformed A can be another pointer or pointer-to-member type that can be converted to the deduced A via a function pointer conversion and/or qualification conversion.

(The relevant case here is 4.1; the third case listed in the standard, which concerns deducing a base class from a derived-class argument, is omitted above.)
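Case 4.2 does not apply to your call, but to illustrate what it covers, here is a minimal sketch (my own addition; h is a hypothetical helper):

template<typename T>
void h(const T*) {} // P = const T*

int main() {
    int i = 0;
    int* p = &i;
    h(p); // A = int*, deduced T = int, deduced A = const int*;
          // int* is converted to const int* via a qualification conversion
}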

Now we can apply this to your example:

template<typename T>
void g(const T&) {}

int main(void)
{
    int i = 0;
    g(i); 
}

In the above, P is const T&, so it is adjusted to const T. Since the type of the passed argument is A = int, the deduced T will be int and the deduced A will be const int, which is more cv-qualified than A, exactly as case 4.1 permits.
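To make the result visible, here is a minimal compile-time check (my own sketch, assuming C++17 for std::is_same_v):

#include <type_traits>

template<typename T>
void g(const T& param) {
    static_assert(std::is_same_v<T, int>, "T is deduced as int");
    static_assert(std::is_same_v<decltype(param), const int&>, "the parameter has type const int&");
}

int main() {
    int i = 0;
    g(i); // instantiates g<int>(const int&)
}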


The same example can be found in cppreference's article on template argument deduction:

template<typename T>
void f(const T& t);
bool a = false;
f(a); // P = const T&, adjusted to const T, A = bool:
      // deduced T = bool, deduced A = const bool
      // deduced A is more cv-qualified than A

CodePudding user response:

Scott Meyers' Effective Modern C++ explains exactly this case in Item 1: Understanding template type deduction, Case 1: ParamType is a Reference or Pointer, but not a Universal Reference.

Considering a code structure as below:

template <typename T>
void f(ParamType param);

f(expr);

We want to deduce T and ParamType from expr. Given that ParamType is a reference but not a universal reference (it is const T& in your example), deduction works as follows:

  1. If expr's type is a reference, ignore the reference part.
  2. Then pattern-match expr's type against ParamType to determine T.

For the particular case of:

template<typename T>
void g(const T&);

int i = 0;
g(i); 

  1. i's type (expr's type) is int.
  2. Pattern matching int against const T& (ParamType) determines T's type as int.

I can try reverse engineering my answer with references to cppreference and The Standard:

  1. cppreference: from Eljay's comment link:
  • The rule 3) If P is a reference type, the referenced type is used for deduction tells you to adjust P's type to const T.
  • Then you have a detailed example of how deduction works under 1) If P is a reference type, the deduced A (i.e., the type referred to by the reference) can be more cv-qualified than the transformed A:
template<typename T>
void f(const T& t);
 
bool a = false;
f(a); // P = const T&, adjusted to const T, A = bool:
      // deduced T = bool, deduced A = const bool
      // deduced A is more cv-qualified than A
  2. The Standard: from [temp.deduct.call]/3: If P is a cv-qualified type, the top-level cv-qualifiers of P's type are ignored for type deduction. If P is a reference type, the type referred to by P is used for type deduction. Example 3 below that text contains almost exactly your example, apart from the fact that the argument is const int.
template<class T> int f(const T&);
const int i = 0;
int n2 = f(i);  // calls f<int>(const int&)
  • P's type is const T&.
  • Since P is a reference type, the type referred to by P, i.e. const T, is used for type deduction.
  • In your example, A's type is int. Deducing T as int makes P's adjusted type const int; the deduced A (const int) is simply more cv-qualified than A (int), which is allowed, as the sketch below demonstrates.
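For completeness, here is a minimal sketch (my own addition, assuming C++17) showing that the standard's example and your call both instantiate the same specialization:

#include <type_traits>

template<class T> int f(const T&) { return 0; }

int main() {
    const int ci = 0;
    int i = 0;
    int n1 = f(ci); // A = const int, deduced T = int: calls f<int>(const int&)
    int n2 = f(i);  // A = int,       deduced T = int: calls f<int>(const int&)

    // Both calls resolve to the specialization f<int>(const int&).
    static_assert(std::is_same_v<decltype(&f<int>), int (*)(const int&)>);
    return n1 + n2;
}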