I have several string processing functions like:
import re

def func1(s):
    return re.sub(r'\s', "", s)

def func2(s):
    return f"[{s}]"
...
I want to combine them into one pipeline function, my_pipeline(), so that I can pass it as an argument, for example:
class Record:
    def __init__(self, s):
        self.name = s

    def apply_func(self, func):
        return func(self.name)

rec = Record(" hell o")
output = rec.apply_func(my_pipeline)
# output = "[hello]"
The goal is to pass my_pipeline as an argument; otherwise I have to call these functions one by one.
Thank you.
CodePudding user response:
You can write a simple factory function or class to build a pipeline function:
>>> def pipeline(*functions):
...     def _pipeline(arg):
...         result = arg
...         for func in functions:
...             result = func(result)
...         return result
...     return _pipeline
...
>>> rec = Record(" hell o")
>>> rec.apply_func(pipeline(func1, func2))
'[hello]'
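The "class" variant mentioned above could be a small callable object (a sketch along the same lines, not part of the original answer):

class Pipeline:
    def __init__(self, *functions):
        self.functions = functions

    def __call__(self, arg):
        # run the argument through each function, left to right
        for func in self.functions:
            arg = func(arg)
        return arg

rec.apply_func(Pipeline(func1, func2)) then returns '[hello]' just like the factory version.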
CodePudding user response:
You can just create a function which calls these functions:
def my_pipeline(s):
    return func1(func2(s))
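Used with the Record class from the question, this gives the expected result (func2 adds the brackets first, then func1 strips the whitespace):

rec = Record(" hell o")
output = rec.apply_func(my_pipeline)
# output == "[hello]"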
CodePudding user response:
Using a list of functions (so you can assemble these elsewhere):
import re

def func1(s):
    return re.sub(r'\s', "", s)

def func2(s):
    return f"[{s}]"

def func3(s):
    return s + 'tada '

def callfuncs(s, pipeline):
    f0 = s
    # apply the functions right-to-left without mutating the caller's list
    for f in reversed(pipeline):
        f0 = f(f0)
    return f0

class Record:
    def __init__(self, s):
        self.name = s

    def apply_func(self, pipeline):
        return callfuncs(self.name, pipeline)
# calling order func1(func2(func3(s)))
my_pipeline = [func1, func2, func3]
rec = Record(" hell o")
output = rec.apply_func(my_pipeline)
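For what it's worth, a quick trace of that calling order (my own check, assuming the + in func3 above): func3 appends 'tada ', func2 adds the brackets, and func1 strips all whitespace, so

print(output)  # "[hellotada]"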