How to continuously wait on any of multiple concurrent tasks to complete?

Time:01-26

Let's say there are multiple sources of events I want to monitor and respond to in an orderly fashion - for instance multiple connected sockets.

What's the best way to continuously await until any of them has data available to be read?

asyncio.wait seems promising, but I am unsure how to make sure that tasks for sockets that were just read from get re-added to the set of tasks to await on.

I tried to re-schedule all of the reads every time the loop ran, but that (obviously) didn't work.

As a hack, I came up with cancelling the pending tasks on each iteration of the loop. The code I currently have looks like this, but I'm not sure it's actually correct in all cases.

while True:
    done, pending = await asyncio.wait(
        [asyncio.create_task(socket1.read()), asyncio.create_task(socket2.read())],
        return_when=asyncio.FIRST_COMPLETED,
    )

    for received in done:
        ...

    for to_cancel in pending:
        to_cancel.cancel()

What would be the most elegant (and correct!) way of doing this?

CodePudding user response:

Just re-create a task for calling each .read() method every time one of the sockets returns. By wrapping the coroutine in a task, you can attach arbitrary metadata to it (in the form of plain Python attributes), which makes it easy to track which task should be re-created:

async def worker():
    pending_sockets = [socket1, socket2]
    pending_tasks = []
    while True:
        # (Re-)schedule a read only for sockets whose previous read completed.
        for sock in pending_sockets:
            task = asyncio.create_task(sock.read())
            task.source = sock  # plain attribute used as metadata
            pending_tasks.append(task)

        done, pending = await asyncio.wait(
            pending_tasks, return_when=asyncio.FIRST_COMPLETED
        )
        pending_tasks = list(pending)  # asyncio.wait returns sets

        pending_sockets = []
        for received in done:
            ...
            pending_sockets.append(received.source)
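Here is a self-contained, runnable sketch of that pattern. The FakeSocket class (backed by asyncio.Queue), its feed() method, and the worker arguments are illustrative stand-ins for real sockets, not part of the original answer:

```python
import asyncio

class FakeSocket:
    """Illustrative stand-in for a connected socket; read() waits for data."""
    def __init__(self, name):
        self.name = name
        self._queue = asyncio.Queue()

    async def read(self):
        return await self._queue.get()

    def feed(self, data):
        self._queue.put_nowait(data)

async def worker(sockets, n_messages):
    results = []
    pending_sockets = list(sockets)
    pending_tasks = []
    while len(results) < n_messages:
        # (Re-)schedule a read only for sockets whose previous read completed.
        for sock in pending_sockets:
            task = asyncio.create_task(sock.read())
            task.source = sock  # plain attribute used as metadata
            pending_tasks.append(task)

        done, pending = await asyncio.wait(
            pending_tasks, return_when=asyncio.FIRST_COMPLETED
        )
        pending_tasks = list(pending)  # asyncio.wait returns sets

        pending_sockets = []
        for received in done:
            results.append((received.source.name, received.result()))
            pending_sockets.append(received.source)

    # Clean up reads still in flight before returning.
    for task in pending_tasks:
        task.cancel()
    return results

async def main():
    s1, s2 = FakeSocket("s1"), FakeSocket("s2")
    s1.feed("hello")
    s2.feed("world")
    s1.feed("again")
    return await worker([s1, s2], n_messages=3)

print(sorted(asyncio.run(main())))
# → [('s1', 'again'), ('s1', 'hello'), ('s2', 'world')]
```

Note that in-flight reads are never cancelled here, so no buffered data is lost: a pending task simply stays in pending_tasks across iterations, and only completed reads get rescheduled.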