I'm coding a series of modules that encapsulate business logic. They are included as mixins in base classes in any number of different combinations.
Each of these modules has its own initialize method defined, because each one specifies its own required keyword arguments. The idea was to call super inside each module's initialize method, so the arguments would climb up the ancestor chain and reach the other modules' initialization methods as well.
However, there's a problem with this pattern:
module SomeModule
  def initialize(arg1:, arg2:, **rest)
    @arg1 = arg1
    @arg2 = arg2
    super
  end
end
This achieves the goal of modularity: any class that includes this module will require those two keyword arguments, and the arguments are passed up the ancestor chain to the initializers of the other modules. However, the last module in the chain will also invoke super, and that raises an exception because it actually reaches BasicObject#initialize, which takes no arguments.
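For illustration, a minimal sketch of how this blows up (the class name Widget is made up):
class Widget
  include SomeModule
end

Widget.new(arg1: 'foo', arg2: 'bar')
# The bare super in SomeModule#initialize forwards both keywords to
# BasicObject#initialize, which accepts no arguments, so this raises ArgumentError.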
Even if you use super if defined?(super), the same problem arises, because BasicObject does define the super method (#initialize).
I tried various metaprogramming tricks to detect the arity of the super method, but method(__method__) returns a Method object resolved from the start of the receiver's ancestor chain (the first definition found), not from the current module's frame.
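To illustrate that behaviour (module and class names are made up): the Method object you get inside one module's initialize can be owned by a completely different module, and its super_method is not the method the current frame's super would call:
module Inspectable
  def initialize(**rest)
    m = method(__method__)
    puts "owner: #{m.owner}"        # resolved from the receiver's lookup chain
    puts "next:  #{m.super_method}" # the method after the owner, not after this frame
    super()
  end
end

module Other
  def initialize(**rest)
    super()
  end
end

class Demo
  include Inspectable
  include Other # Other ends up before Inspectable in the lookup
end

Demo.new
# owner is Other, and super_method points back at Inspectable#initialize,
# while the super in this frame actually reaches BasicObject#initialize.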
So, how can I achieve this goal of modularity, so that included modules keep calling super in their initialize methods until there are no more modules left, without reaching BasicObject#initialize and blowing up, and without having to worry about the order in which these modules are included?
UPDATE: I'd really like to avoid having to rescue the exception raised by BasicObject#initialize.
UPDATE2: calling super(**rest) also doesn't cut it, because if any module further up the ancestor chain depends on an argument that was already absorbed by a previous module, that argument will be missing upstream.
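A contrived sketch of that failure mode, with made-up Billing and Auditing modules that both require the same keyword:
module Billing
  def initialize(account_id:, **rest)
    @billing_account = account_id
    super(**rest) # account_id is absorbed here and not forwarded
  end
end

module Auditing
  def initialize(account_id:, **rest) # but this module needs it too
    @audit_account = account_id
    super(**rest)
  end
end

class Invoice
  include Auditing
  include Billing
end

Invoice.new(account_id: 42)
# Billing#initialize consumes account_id and forwards only **rest,
# so Auditing#initialize raises ArgumentError (missing keyword: :account_id).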
CodePudding user response:
You could have your classes inherit from some base class that implements initialize accordingly, e.g.:
class Base
  def initialize(**)
    super() # discard the remaining keywords so BasicObject#initialize receives no arguments
  end
end
class MyClass < Base
  include SomeModule
end
MyClass.new(arg1: 'foo', arg2: 'bar')
#=> #<MyClass:0x00007fa394843bb0 @arg1="foo", @arg2="bar">
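This also composes when several such modules are mixed in, in any order, and the chain still terminates cleanly at Base (OtherModule here is made up):
module OtherModule
  def initialize(arg3:, **rest)
    @arg3 = arg3
    super
  end
end

class Combined < Base
  include SomeModule
  include OtherModule
end

Combined.new(arg1: 'foo', arg2: 'bar', arg3: 'baz')
# OtherModule#initialize -> SomeModule#initialize -> Base#initialize(**) -> super()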
CodePudding user response:
Answering your question generically would probably involve a lot of metaprogramming trickery (if it's possible at all), which is definitely something to avoid, but I'll try to provide an answer for your particular case.
I would recommend using prepend instead of include (assuming you are not prepending anything else to these classes that needs this set of initialize methods) and catching the final super in the class itself:
module M1
  def initialize(arg1:, **kwargs)
    puts "initialize from M1 with arg1 = #{arg1}"
    super
  end
end

module M2
  def initialize(arg2:, **kwargs)
    puts "initialize from M2 with arg2 = #{arg2}"
    super
  end
end

class C1
  prepend M1
  prepend M2

  def initialize(**kwargs)
    puts "initialize from C1"
  end
end
C1.new(arg1: 'A1', arg2: 'A2')
In this case you can prepend your modules in whatever order you want, and the final super will call C1#initialize, so that is the stopping point in this chain of calls. If you want to call another super from C1#initialize, just use the variant with explicit arguments and provide only what the next initialize expects (it shouldn't know anything about arg1, arg2, etc.).
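A sketch of that last point, with a made-up Parent superclass and a variant of C1 that forwards only what Parent expects:
class Parent
  def initialize(name)
    @name = name
  end
end

class C1 < Parent
  prepend M1
  prepend M2

  def initialize(name:, **kwargs)
    puts "initialize from C1"
    super(name) # explicit argument: Parent knows nothing about arg1 or arg2
  end
end

C1.new(name: 'demo', arg1: 'A1', arg2: 'A2')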