python for each run async function without await and parallel

Time:12-08

I have 10 links in my CSV which I'm trying to request all at the same time in a loop from the getTasks function. However, the way it's working now, it sends a request to link 1, waits for it to complete, then link 2, and so on. I want all 10 of my links to be requested concurrently whenever startTask is called, leading to 10 requests at once.

Anyone know how to code that using the code below? Thanks in advance.


import requests
from bs4 import BeautifulSoup
import asyncio

def getTasks(tasks):
    for task in tasks:
        asyncio.run(startTask(task))


async def startTask(task):
    success = await getProduct(task)
    if success is None:
        return await startTask(task)

    success = await addToCart(task)
    if success is None:
        return await startTask(task)

    ...
    ...
    ...

getTasks(tasks)

CodePudding user response:

First of all, to make your requests run concurrently, you should use aiohttp instead of the requests package, which blocks on I/O. Then use asyncio's semaphore to limit the number of requests that are in flight at the same time.

import asyncio
import aiohttp

# read links from CSV
links = [
    ...
]

semaphore = asyncio.BoundedSemaphore(10) 
# 10 is the max count of concurrent tasks
# that can be processed at the same time.
# In this case, tasks are requests.

async def async_request(url):
    async with aiohttp.ClientSession() as session:
        async with semaphore, session.get(url) as response:
            return await response.text()


async def main():
    result = await asyncio.gather(*[
        async_request(link) for link in links
    ])
    print(result)  # [response1, response2, ...]


if __name__ == "__main__":
    asyncio.run(main())
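For completeness, the retry logic from the question's startTask can be combined with the same semaphore/gather pattern. The sketch below is illustrative, not the exact answer code: `fetch` stands in for the real aiohttp request (simulated here with `asyncio.sleep` so it runs standalone), and a bounded for-loop replaces the question's unbounded recursive retry.

```python
import asyncio

async def fetch(url):
    # Placeholder for the real aiohttp request; in practice this
    # would be `async with session.get(url) as resp: ...`.
    await asyncio.sleep(0.01)
    return f"response for {url}"

async def start_task(task, semaphore, retries=3):
    # Bounded retry loop instead of the question's recursive retry,
    # so a permanently failing link cannot recurse forever.
    for _ in range(retries):
        async with semaphore:  # hold a slot only while requesting
            result = await fetch(task)
        if result is not None:
            return result
    return None

async def main(tasks):
    # At most 10 requests in flight at once; gather launches them
    # all concurrently and preserves input order in the results.
    semaphore = asyncio.BoundedSemaphore(10)
    return await asyncio.gather(
        *[start_task(t, semaphore) for t in tasks]
    )

results = asyncio.run(main([f"link{i}" for i in range(10)]))
```

With a real aiohttp session, you would create one `ClientSession` in `main` and pass it into `start_task`, since reusing a single session across requests is cheaper than opening one per request.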