Why can't C++ infer array size with offset?


This code fails to compile

template<unsigned n>
void test(char const (*)[n + 1]) { }

int main()
{
    char const arr[] = "Hi";
    test(&arr);
}

with error

note: candidate template ignored: couldn't infer template argument 'n'

However, if you change `n + 1` to `n`, it compiles just fine.

Why can't the compiler deduce n when it has an offset added to it?

CodePudding user response:

From cppreference, in the "Non-deduced contexts" section:

In the following cases, the types, templates, and non-type values that are used to compose P do not participate in template argument deduction, but instead use the template arguments that were either deduced elsewhere or explicitly specified. If a template parameter is used only in non-deduced contexts and is not explicitly specified, template argument deduction fails.

(...)

  1. A non-type template argument or an array bound in which a subexpression references a template parameter:
template<std::size_t N> void f(std::array<int, 2 * N> a);
std::array<int, 10> a;
f(a); // P = std::array<int, 2 * N>, A = std::array<int, 10>:
      // 2 * N is non-deduced context, N cannot be deduced
      // note: f(std::array<int, N> a) would be able to deduce N 

(...)

In any case, if any part of a type name is non-deduced, the entire type name is non-deduced context. (...)

Because `n + 1` is a subexpression that references the template parameter, the array bound is a non-deduced context, and therefore `n` cannot be deduced.
