How to run python function every 1 sec in parallel

I was thinking of using the multiprocessing package to run a function in parallel, but I need to pass a different parameter value on every run (every 1 second).

e.g.:

import multiprocessing
from time import sleep

def foo(lst):  # renamed from `list` to avoid shadowing the builtin
    while True:
        # <do something with lst>
        sleep(1)  # sleep() takes seconds, so 1 (not 1000) for a 1-second interval

def main():
    # note the trailing comma: args must be a tuple
    process = multiprocessing.Process(target=foo, args=(lst,))
    process.start()

    # <keep updating lst>

This causes the foo() function to run with the same parameter value over and over. How can I work around this?
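For context on why updating lst in the parent doesn't help: a multiprocessing.Process gets its own copy of its arguments, so parent-side mutations are invisible to the child. A minimal sketch demonstrating this (the function and variable names here are illustrative):

```python
import multiprocessing


def report(lst, out):
    # Send the child's view of the list back to the parent.
    out.put(list(lst))


def demo():
    lst = [1]
    out = multiprocessing.Queue()
    p = multiprocessing.Process(target=report, args=(lst, out))
    p.start()
    lst.append(2)  # mutates only the parent's copy
    seen = out.get()
    p.join()
    return lst, seen


if __name__ == "__main__":
    parent_view, child_view = demo()
    print("parent:", parent_view)    # [1, 2]
    print("child saw:", child_view)  # [1] -- the child's copy was taken at start()
```

The child's copy is made when start() is called (at fork, or when the arguments are pickled under the spawn start method), so the append that happens afterwards never reaches it. That is why some form of message passing, as in the answers below, is needed.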

CodePudding user response:

Armed with the knowledge of what you're actually trying to do, i.e.

The foo function does an http post call to save some logs (batch) to the storage. The main function is getting text logs (save log to the batch) while running a given shell script. Basically, I'm trying to do batching for logging.

the answer is to use a thread and a queue for message passing (multiprocessing.Process and multiprocessing.Queue would also work, but aren't really necessary):

import threading
import time
from queue import Queue


def send_batch(batch):
    print("Sending", batch)


def save_worker(queue: Queue):
    while True:
        batch = queue.get()
        if batch is None:  # stop signal
            break
        send_batch(batch)
        queue.task_done()


def main():
    batch_queue = Queue()
    save_thread = threading.Thread(target=save_worker, args=(batch_queue,))
    save_thread.start()
    log_batch = []

    for x in range(42):  # pretend this is the shell script outputting things
        message = f"Message {x}"
        print(message)
        log_batch.append(message)
        time.sleep(0.1)
        if len(log_batch) >= 7:  # could also look at wallclock
            batch_queue.put(log_batch)
            log_batch = []
    if log_batch:
        batch_queue.put(log_batch)  # send the last batch
    print("Script stopped, waiting for worker to finish")
    batch_queue.put(None)  # stop signal
    save_thread.join()


if __name__ == "__main__":
    main()
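The `# could also look at wallclock` comment above can be pushed into the worker itself: the producer queues individual messages, and the worker flushes whatever has accumulated every `flush_interval` seconds. A sketch of that variant (the `flush_interval` parameter and helper names are my own, not from the answer above):

```python
import threading
import time
from queue import Queue, Empty


def send_batch(batch):
    print("Sending", batch)


def timed_save_worker(queue: Queue, flush_interval: float = 1.0):
    batch = []
    deadline = time.monotonic() + flush_interval
    while True:
        # Wait for the next message, but never past the flush deadline.
        timeout = max(0.0, deadline - time.monotonic())
        try:
            item = queue.get(timeout=timeout)
            if item is None:  # stop signal: flush what's left and exit
                if batch:
                    send_batch(batch)
                return
            batch.append(item)
        except Empty:
            pass  # deadline reached with no new message
        if time.monotonic() >= deadline:
            if batch:
                send_batch(batch)
                batch = []
            deadline = time.monotonic() + flush_interval


def main():
    batch_queue = Queue()
    worker = threading.Thread(target=timed_save_worker, args=(batch_queue, 0.5))
    worker.start()
    for x in range(10):  # pretend this is the shell script outputting things
        batch_queue.put(f"Message {x}")
        time.sleep(0.1)
    batch_queue.put(None)  # stop signal
    worker.join()


if __name__ == "__main__":
    main()
```

This sends a batch roughly every flush_interval seconds regardless of batch size, which matches the original "every 1 sec" requirement more directly than counting messages.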

CodePudding user response:

import threading
import time

def run_every_second(param):
    # your function code here
    print(param)

# create a list of parameters to pass to the function
params = [1, 2, 3, 4]

# create and start a thread for each parameter,
# keeping references so we can join them later
threads = []
for param in params:
    t = threading.Thread(target=run_every_second, args=(param,))
    t.start()
    threads.append(t)
    time.sleep(1)

# wait for all threads to complete
for t in threads:
    t.join()

This will create a new thread for each parameter, and each thread will run the run_every_second function with its corresponding parameter. The threads run concurrently (though CPU-bound work is still serialized by the GIL), with a 1-second stagger between the start of each thread.
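An alternative that avoids managing the thread list by hand is concurrent.futures.ThreadPoolExecutor. This sketch staggers submissions by `interval` seconds and collects the results; the body of run_every_second and the `launch_staggered` helper are stand-ins for your own code:

```python
import time
from concurrent.futures import ThreadPoolExecutor


def run_every_second(param):
    # your function code here; return something so results can be collected
    print(param)
    return param * 2


def launch_staggered(params, interval=1.0):
    with ThreadPoolExecutor(max_workers=len(params)) as pool:
        futures = []
        for param in params:
            futures.append(pool.submit(run_every_second, param))
            time.sleep(interval)  # 1-second lapse between starts
        # result() blocks until each task finishes, preserving submission order
        return [f.result() for f in futures]


if __name__ == "__main__":
    print(launch_staggered([1, 2, 3, 4]))  # takes ~4 seconds
```

The executor joins its threads automatically when the with block exits, so no explicit bookkeeping is needed.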
