Recently I received feedback from a colleague who uses Coverity, a static analysis tool, about an issue flagged with the following warning:
Passing the value of a large parameter (PASS_BY_VALUE)
pass_by_value: Passing parameter parameter_name of type class_name (size 184 bytes) by value, which exceeds the low threshold of 128 bytes.
This led me to wonder how the size of a parameter affects the quality of the application, and at what point it becomes a major concern if parameter sizes get out of hand.
What is a good rule of thumb for keeping it in check?
CodePudding user response:
| | Pass by value | Pass by reference | Advantage |
|---|---|---|---|
| Cost of passing the argument | copy of the object | copy of a pointer | by value if `sizeof(T) <= sizeof(T*)`, by reference if `sizeof(T) > sizeof(T*)` |
| Object access | direct | indirect | by value |
| Can alias | no, so the optimizer can do more access optimizations | yes, so the optimizer is restricted in access optimizations | by value |
For objects smaller than or equal in size to a pointer it's a no-brainer: pass by value has no disadvantages compared to pass by reference, only advantages.
As objects get bigger, the cost of copying eventually outweighs the disadvantages of pass by reference. Where exactly that line falls is difficult to tell, and it differs between architectures and calling conventions. For your environment Coverity draws the line at 128 bytes, which I am sure was arrived at by smart people with experience and from relevant testing. You can use that limit, pick another one, or profile your application to see what applies to your case.