Default Arguments vs Overloading?


In WG21 N0131 Bill Gibbons states:

default arguments are often considered an anachronism because they can be replaced with overloaded functions

I understand that a single function like:

void f(T t = t0, U u = u0);

can be replaced by the three overloads:

void f() { f(t0); }
void f(T t) { f(t, u0); }
void f(T t, U u);

but what I don't understand is why the latter should be preferred over the former? (That is what he means by "anachronism", right?)

There's some related discussion in the Google C++ Style Guide here: Google C++ Style Guide > Default Arguments, but I don't see how it answers the question or supports Gibbons' claim.

Anyone know what he's talking about? Why are default arguments considered an anachronism?

CodePudding user response:

I would first of all refer to this article on Fluent C++, which addresses this very question and gives a clear personal answer near the top of the post:

By default, I think that we should prefer default parameters rather than overloads.

However, as the "By default" implies, the author does give merit to overloads over default parameters in some particular situations.

My original answer follows, but I have to say: the article linked above did substantially reduce my repulsion for default arguments...

Given void f(T t = t0, U u = u0);, you have no way to call f with a custom u while letting t be the default t0 (unless you manually call f(t0, some_u), obviously).

With the overloads, it's easy: you just add f(U u) to the set of overloads.

So with overloads you can do what you can do with default arguments, plus more.
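As a minimal sketch (the types T and U, and the defaults t0 and u0, are placeholders since the question doesn't define them), the extended overload set could look like this:

struct T {}; struct U {};
const T t0{};
const U u0{};

void f(T t, U u) { /* the real work happens here */ }
void f()         { f(t0, u0); } // both arguments defaulted
void f(T t)      { f(t, u0);  } // default u only
void f(U u)      { f(t0, u);  } // default t only -- not expressible with
                                // default arguments on a single f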

Besides, since with this question we are already in the land of opinions, why not mention the fact that you can re-declare functions, adding more defaults along the way? (Example taken from cppreference.)

void f(int, int);     // #1 
void f(int, int = 7); // #2 OK: adds a default 
void f(int = 1, int); // #3 OK, adds a default to #2
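For completeness (still assuming the three declarations above are all visible at the call site), the practical effect is that each re-declaration makes more call forms well-formed; the calls below are wrapped in a hypothetical g() just to give them a home:

void g() {
    f(3, 4); // OK under #1: no defaults needed
    f(3);    // OK after #2: equivalent to f(3, 7)
    f();     // OK after #3: equivalent to f(1, 7)
}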

And the fact that the definition of a function cannot re-define a default argument if a previous declaration of the function defines it (for a pretty clear and understandable reason)?

void f(int, int = 7);   // in a header file
void f(int, int) {}     // in a cpp file: OK
void f(int, int = 7) {} // in a cpp file: error, re-defines the default argument

Yes, maybe default arguments are an "interface thing", so it's probably fine not to see any sign of them in an implementation file...

CodePudding user response:

From my own experience, the problem is that of violating the principle of least astonishment when interacting with other language features. Let's say you have a component that uses f a lot. I.e. you see this in plenty of places:

f();

From reading it, you assume you have a function that takes no arguments. So when you need to add interaction with some other component that has a registration function:

void register_callback(void (*cb)());

you do the obvious thing...

register_callback(f);

... and you immediately get a nice shiny error because the declared type of f is a function that takes two arguments. Wtf!? So you look at the declaration and understand... right...

Default arguments make your code behave a certain way by having the compiler "fudge" the call site to make things work. f() isn't really a call to a function that takes no arguments; the compiler implicitly supplies the two default arguments and calls the two-argument function.

On the other hand, the overload set does behave as one would expect. There is no "fudging" of the call site by the compiler, and when we try register_callback(f)... it works!
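To make that concrete, here is a minimal sketch (the names register_callback, T, U, t0 and u0 are illustrative, not from the original code): because the overload set contains a genuine zero-argument f, passing f to a void(*)() parameter selects that overload.

#include <iostream>

struct T {}; struct U {};
const T t0{};
const U u0{};

void f(T, U) { std::cout << "f(T, U)\n"; }
void f(T t)  { f(t, u0); }
void f()     { f(t0); }  // a real zero-argument function

void register_callback(void (*cb)()) { cb(); }

int main() {
    register_callback(f); // OK: overload resolution against void(*)()
                          // picks the zero-argument f.
    // With the default-argument version, void f(T = t0, U = u0), this call
    // would not compile, because f's declared type is void(T, U).
}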

CodePudding user response:

An anachronism is something that stands out in the present because it is widely considered to be a thing of the past.

The rest of my answer admittedly sucks for StackOverflow because it's a matter of opinion... but the question itself supposes that there isn't a hard-and-fast answer.

As for why default args are a thing of the past, there could be many examples... the best one that comes to mind for me, however, is that especially when writing a set of reusable functions, we want to reduce the potential for misuse.

Consider the following:

void f(int i = 0, char c = 'A') { std::cout << i << c << std::endl; }

Now consider that someone attempts to use it as follows:

f('B');

They probably expected to see the output:

0B

What they get however is:

66A

Upon seeing the output they understand their mistake and correct it... but if you remove the default parameters and instead force the use of one of a couple of specific overloads, each accommodating a single parameter of either type, then you have made a more robust interface that produces the expected output every time. The default args work... but they aren't necessarily the most "clear" during development, when someone forgets that if at least one arg is supplied in the call, only the trailing args can be defaulted.
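As a minimal sketch of that overload-based interface (the choice of single-parameter overloads here is my assumption of what "accommodating a single param of either type" would look like):

#include <iostream>

void f(int i, char c) { std::cout << i << c << std::endl; }
void f()              { f(0, 'A'); }
void f(int i)         { f(i, 'A'); }
void f(char c)        { f(0, c);   }

int main() {
    f('B'); // exact match against f(char): prints "0B", as the caller
            // expected, instead of the surprising "66A"
}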

In the end, what matters is that the code works... but if you saw code with labels and goto statements, you'd be like: oh really? They work fine... but damn, they can be misused... Switching languages to stress the subjective nature of the discussion in general: if JavaScript works well and provides so much freedom given the mutable nature of its variable types, why on earth would anyone WANT to use TypeScript? It's a matter of simplifying/enforcing proper REUSE of the code. Otherwise, who cares as long as it works...
