Why does an aiohttp request get stuck when a lock is used?


Why does this code:

import asyncio
import time
from multiprocessing import Pool, Manager
from threading import Thread, Lock

from aiohttp import ClientSession


async def test(s: ClientSession, lock: Lock, identifier):
    print(f'before acquiring {identifier}')
    lock.acquire()
    print(f'before request {identifier}')
    async with s.get('http://icanhazip.com') as r:
        print(f'after request {identifier}')
    lock.release()
    print(f'after releasing {identifier}')


async def main(lock: Lock):
    async with ClientSession() as s:
        await asyncio.gather(test(s, lock, 1), test(s, lock, 2))


def run(lock: Lock):
    asyncio.run(main(lock))


if __name__ == '__main__':
    # Thread(target=run, args=[Lock()]).start()
    with Pool(processes=1) as pool:
        pool.map(run, [Manager().Lock()])

prints:

before acquiring 1
before request 1
before acquiring 2

and then hang? Why is the request with identifier 1 never completed? The same thing happens with a Thread (commented out above). I tried the same pattern with requests, and it works.

CodePudding user response:

This is happening because you are mixing synchronous locks, which block an entire thread of execution, with asyncio, which requires all operations to be non-blocking. Both of your coroutines (the two calls to test) run in the same thread, so when the second coroutine attempts to take the lock and blocks, it also blocks the thread's event loop, which prevents the first coroutine (the one holding the lock) from ever resuming and releasing it.
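To make the mechanism visible, here is a self-contained sketch of the same deadlock (the function names are mine, the aiohttp request is replaced by asyncio.sleep, and the second acquire is given a timeout so the demo terminates instead of hanging forever):

```python
import asyncio
import threading


async def holder(lock: threading.Lock):
    lock.acquire()            # lock is free, so this succeeds immediately
    await asyncio.sleep(0.2)  # stand-in for the aiohttp request
    lock.release()


async def blocker(lock: threading.Lock):
    # This blocks the whole thread, including the event loop, so
    # holder() can never resume and release the lock. With the real
    # code this blocks forever; the timeout lets the demo finish.
    acquired = lock.acquire(timeout=1)
    print('second acquire succeeded:', acquired)
    if acquired:
        lock.release()


async def main():
    lock = threading.Lock()
    await asyncio.gather(holder(lock), blocker(lock))


if __name__ == '__main__':
    asyncio.run(main())
```

The `acquire(timeout=1)` call returns False after a second: the event loop was frozen the whole time, so the holder's sleep never completed and the lock was never released.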

You can fix this by using an asyncio.Lock instead. Waiting on it suspends only the coroutine that is waiting, rather than blocking the entire thread. Note that an asyncio.Lock can't be passed between processes, though, so it will not work as long as you keep handing the lock across the multiprocessing boundary. That isn't actually necessary in your example: you create a single lock that is only ever used inside a single child process, so you can simply create the asyncio.Lock in the child process with no loss of functionality.

However, if your actual use case requires an asyncio-friendly lock that can also be shared between processes, you can use aioprocessing for that (full disclosure: I am the author of aioprocessing).
