When constrained function calls nest, the concept requirement checks are often duplicated.
Look at the example below:
template<typename I>
    requires std::forward_iterator<I>
auto fun1(I i) { ... }

template<typename I>
    requires std::forward_iterator<I>
auto fun2(I i) {
    fun1(i);
    ...
}
fun1 is called inside fun2, thus std::forward_iterator<I>
is checked twice.
I want to know whether this kind of duplication of requirement checks negatively affects compile time.
Or do you think we should endeavor to reduce duplication of requirement checks?
Addition:
We can somewhat avoid the duplication of requirement checks, as shown below:
template<typename I>
auto fun1_unchecked(I i) { ... }

template<typename I>
    requires std::forward_iterator<I>
auto fun1(I i) {
    return fun1_unchecked(i);
}

template<typename I>
    requires std::forward_iterator<I>
auto fun2(I i) {
    fun1_unchecked(i);
    ...
}
But I don't know whether this is worth the effort.
CodePudding user response:
If fun1 is called by someone other than fun2, then the presence of the concept is not duplication. Code which calls just fun1 needs a conceptualized interface for it as well. They're two independent functions; one of them just so happens to call the other.
If fun1 is only called by fun2, then the concept could be removed. But otherwise, it's a meaningful part of your system and should have a proper interface.
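For example (a minimal sketch; fun3 is a hypothetical extra caller, not from the original question), keeping the constraint on fun1 means any other code can call it through the same checked interface:

#include <iterator>
#include <list>

template<typename I>
    requires std::forward_iterator<I>
auto fun1(I i) { return *i; }

template<typename I>
    requires std::forward_iterator<I>
auto fun2(I i) { return fun1(i); }

// fun3 also calls fun1 directly, so fun1 benefits from
// having its own constrained interface.
template<typename I>
    requires std::forward_iterator<I>
auto fun3(I i) { return fun1(i); }

int main() {
    std::list<int> l{1, 2, 3};
    fun2(l.begin());
    fun3(l.begin());
}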
As for the compile-time costs of "duplication", the standard gives C++ implementations all the leeway they need to minimize any costs. For any given type T, if a compiler (of one file) sees concept_name<T>, the compiler can freely assume that every subsequent use of concept_name<T> will produce the same value. So it can effectively cache it.
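As an illustration (a sketch of the point above, not code from the answer), both assertions below name the same concept with the same argument, so an implementation may evaluate std::forward_iterator<int*> once and reuse the result:

#include <iterator>

// Both checks refer to the same concept-id with the same template argument;
// they must yield the same value, so an implementation may compute it once.
static_assert(std::forward_iterator<int*>);
static_assert(std::forward_iterator<int*>);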
So any costs of "duplication" should be minimized.
Do you think we should endeavor to reduce duplication of requirement checks?
No. Not unless you have actual performance metrics from actual compilers which tell you that there is an actual performance problem.
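If you do want actual metrics, one low-effort sketch (the file name here is just an illustration, and -ftime-trace is Clang-specific) is to compile a small translation unit that exercises the constrained functions and look at where the compiler spends its time:

// bench.cpp -- hypothetical micro-benchmark, not from the original post
#include <iterator>
#include <vector>

template<typename I>
    requires std::forward_iterator<I>
auto fun1(I i) { return *i; }

template<typename I>
    requires std::forward_iterator<I>
auto fun2(I i) { return fun1(i); }

int main() {
    std::vector<int> v{1, 2, 3};
    return fun2(v.begin());
}

// Compile with, e.g.:
//   clang++ -std=c++20 -ftime-trace bench.cpp
// then open the generated time-trace JSON (chrome://tracing or Speedscope)
// to see how much time goes to template instantiation and constraint checking.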