Python Multiprocessing doesn't seem to end

Time:07-09

I have a process which moves lots of data into a database. I use multiprocessing for this.

It runs nice and quickly, but even when it's finished (all the rows are moved), it doesn't seem to end.

I've added join because I understood it to mean the main process will block until all the worker processes are complete.

Is there something I have missed here? Why doesn't it end?

import multiprocessing as mp

p = mp.Pool(mp.cpu_count())
p.map(do_process, result)
p.close()  # stop accepting new tasks
p.join()   # wait for the workers to exit
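A common reason this pattern hangs even after the work is done is a missing __main__ guard: on platforms where multiprocessing spawns workers by re-importing the script, unguarded pool code runs again in every child. Here is a minimal, runnable sketch of the same close/join pattern with the guard; square is a stand-in for the do_process function, which isn't shown in the question:

```python
import multiprocessing as mp

def square(x):
    # stand-in for the question's do_process
    return x * x

if __name__ == "__main__":
    # using the pool as a context manager calls terminate/cleanup
    # when the block exits; join then waits for the workers
    with mp.Pool(mp.cpu_count()) as p:
        out = p.map(square, range(8))
    print(out)
```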

CodePudding user response:

A nicer way of doing it is to use joblib:

import joblib

with joblib.parallel_backend('loky'):
    results = joblib.Parallel(n_jobs=-1)(
        joblib.delayed(do_process)(item)
        for item in result
    )

I am assuming here that result is an iterable of items that do_process is mapped over. The with statement ensures the backend is cleaned up when the block exits. If you want to see what is happening, you can pass a verbose setting to the joblib.Parallel object.
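If you would rather stay in the standard library, concurrent.futures gives the same with-statement cleanup: the block shuts the executor down and waits for the workers before continuing. A small sketch, with a placeholder do_process since the real one isn't shown:

```python
from concurrent.futures import ProcessPoolExecutor

def do_process(item):
    # placeholder for the question's do_process
    return item * 2

if __name__ == "__main__":
    # leaving the with block calls shutdown(wait=True),
    # so no explicit close()/join() is needed
    with ProcessPoolExecutor() as ex:
        results = list(ex.map(do_process, [1, 2, 3]))
    print(results)
```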

Hope it helps
