Piping the output of a multiprocess program into multiple text files

I have written code that uses Python's multiprocessing library. It runs each script separately, using a Pool:

import os
from multiprocessing import Pool

scripts = ('myfirstfile.py',
           'mysecondfile.py',
           'mythirdfile.py',
           'myfourthfile.py')

def run_process(script):
    # Launch the script as a child process and wait for it to finish
    os.system('python {}'.format(script))

pool = Pool(processes=4)
pool.map(run_process, scripts)

Although it executes properly, each script (myfirstfile.py and the others) produces output that I want to store in a separate file per script: it contains information that must not get mixed together, which does happen when two processes finish at the same time. Also, running it from the terminal does not always execute all the processes; sometimes only half of them run, sometimes only one.

What is the proper way to pipe the output of each script into its own file?

CodePudding user response:

The subprocess module makes it easy to capture the output of each script. You can define a separate output file per script, deriving its filename from the script name:

import subprocess
from multiprocessing import Pool
from pathlib import Path

scripts = ('myfirstfile.py',
           'mysecondfile.py',
           'mythirdfile.py',
           'myfourthfile.py')

def run_process(script):
    # Name the log file after the script: myfirstfile.py -> myfirstfile.log
    log_file = Path(script).stem + '.log'
    with open(log_file, 'w') as log_handle:
        # Redirect the script's stdout to its log file and merge stderr
        # into it; check=True raises CalledProcessError on a non-zero exit
        subprocess.run(['python', script], check=True, text=True,
                       stdout=log_handle, stderr=subprocess.STDOUT)

if __name__ == '__main__':
    # The guard keeps spawn-based platforms (e.g. Windows) from
    # re-running the pool setup in each worker process
    with Pool(processes=4) as pool:
        pool.map(run_process, scripts)
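
As a variant, if you also want to notice when one of the scripts fails, you can capture the output in memory and write it out afterwards, along with checking the exit status. Below is a minimal sketch of that approach (the run_and_log helper is a name chosen here for illustration, not part of the answer above):

import subprocess
from multiprocessing import Pool
from pathlib import Path

scripts = ('myfirstfile.py',
           'mysecondfile.py',
           'mythirdfile.py',
           'myfourthfile.py')

def run_and_log(script):
    # capture_output=True collects stdout and stderr in memory instead of
    # streaming them to a file handle, so they can be inspected afterwards
    result = subprocess.run(['python', script], capture_output=True, text=True)
    log_file = Path(script).stem + '.log'
    with open(log_file, 'w') as log_handle:
        log_handle.write(result.stdout)
        log_handle.write(result.stderr)
    # Report the exit status back to the parent process
    return script, result.returncode

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        for script, status in pool.map(run_and_log, scripts):
            if status != 0:
                print('{} exited with status {}'.format(script, status))

Since each worker only waits on a child process, a multiprocessing.pool.ThreadPool would work just as well here and avoids spawning extra interpreter processes for the pool itself.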