How can I make sure that one process starts before the other?
Background: I am doing computations in two separate processes (each of them running in a loop), but the computations in one process need the results from the computations of the other process. The first thought might be: why not do it synchronously? But one calculation runs faster than the other, and if no new result is available, it should just continue with the last available value.
Example Code:
from multiprocessing import Process, Queue
import time
import random

queue = Queue()

def some_calculation():
    # this is slower
    time.sleep(2)
    return random.randint(0, 10)

def some_other_calculation(required):
    # this is faster
    time.sleep(0.5)
    return required + random.randint(0, 10)
def first_target(queue):
    while True:
        # doing some computations
        res = some_calculation()
        queue.put(res)

def second_target(queue):
    res_list = []
    # req = 5 as an initial guess might also work, but let's
    # assume this is not an option
    while True:
        try:
            req = queue.get(block=False)
        except:
            print('no new value available, reuse old value')
        res = some_other_calculation(req)
        res_list.append(res)
        print('results so far:', res_list)
if __name__ == '__main__':
    proc1 = Process(target=first_target, args=(queue,))
    proc2 = Process(target=second_target, args=(queue,))
    proc1.start()
    # time.sleep(3) here everything works fine, but is there
    # maybe a more elegant solution?
    proc2.start()
Is there a way -- aside from just sleeping or predefining a value for req -- to make sure that the one process really starts before the other one does?
Many thanks in advance for any help!
CodePudding user response:
The idea is for second_target to start a thread that issues blocking get calls on the queue and thus constantly updates the req variable with the latest value available. second_target also makes an initial synchronous call to some_calculation to get its initial value of req, which greatly simplifies the logic. I have also replaced the infinite while True loop in second_target with a more limited loop for demo purposes, so that the program terminates:
from multiprocessing import Process, Queue
import time
import random
from threading import Thread

def some_calculation():
    # this is slower
    time.sleep(2)
    return random.randint(0, 10)

def some_other_calculation(required):
    # this is faster
    time.sleep(0.5)
    return required + random.randint(0, 10)

def first_target(queue):
    while True:
        # doing some computations
        res = some_calculation()
        queue.put(res)

def second_target(queue):
    def monitor_queue():
        nonlocal req
        while True:
            req = queue.get()

    res_list = []
    req = some_calculation()  # call it synchronously for the initial value
    last_req = req
    Thread(target=monitor_queue, daemon=True).start()
    for _ in range(10):  # for testing purposes so we terminate:
    #while True:
        if req == last_req:
            print('no new value available, reuse old value')
        else:
            print('got a new value')
            last_req = req
        res = some_other_calculation(req)
        res_list.append(res)
        print('results so far:', res_list)

if __name__ == '__main__':
    queue = Queue()
    p = Process(target=first_target, args=(queue,), daemon=True)
    p.start()
    second_target(queue)
Prints:
no new value available, reuse old value
results so far: [5]
got a new value
results so far: [5, 2]
no new value available, reuse old value
results so far: [5, 2, 2]
no new value available, reuse old value
results so far: [5, 2, 2, 7]
no new value available, reuse old value
results so far: [5, 2, 2, 7, 11]
got a new value
results so far: [5, 2, 2, 7, 11, 11]
no new value available, reuse old value
results so far: [5, 2, 2, 7, 11, 11, 13]
no new value available, reuse old value
results so far: [5, 2, 2, 7, 11, 11, 13, 13]
got a new value
results so far: [5, 2, 2, 7, 11, 11, 13, 13, 7]
no new value available, reuse old value
results so far: [5, 2, 2, 7, 11, 11, 13, 13, 7, 11]
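An alternative to calling some_calculation synchronously inside second_target is to make the consumer block on the queue exactly once for its first value: that single blocking get also guarantees the producer has delivered a result before the consumer does any work, without sleeping or guessing an initial req. A minimal sketch of this idea (the producer/consumer names, shortened sleeps, and loop counts are illustrative, not from the answer above):

```python
from multiprocessing import Process, Queue
import time
import random

def producer(queue):
    # stand-in for the slow calculation loop
    for _ in range(3):
        time.sleep(0.2)
        queue.put(random.randint(0, 10))

def consumer(queue):
    # block once: this cannot proceed until the producer has put a value,
    # so the ordering is guaranteed without an explicit sleep
    req = queue.get()
    results = []
    for _ in range(5):
        try:
            # refresh req if a newer value is already waiting
            req = queue.get(block=False)
        except Exception:
            pass  # queue empty: keep the last value
        time.sleep(0.1)  # stand-in for the fast calculation
        results.append(req + random.randint(0, 10))
    return results

if __name__ == '__main__':
    q = Queue()
    p = Process(target=producer, args=(q,), daemon=True)
    p.start()
    print(consumer(q))
```

Compared to the monitor-thread version, this keeps everything in one thread, at the cost of only picking up one queued value per loop iteration rather than always seeing the very latest.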