Trying to get 2 values returned from async aiohttp get call

Time: 04-20

Today I was trying to speed up my script and found great example code in another Stack Overflow post. Basically, I found a way to make asynchronous web requests with aiohttp instead of using requests. Here is the link to that post (I copied the code from DragonBobZ's answer).

Link to other stackoverflow post from which I copied code

The issue is that I am trying to get it to return two values (url, response) instead of just the response from the request made. Here is the code I took.

import asyncio
import aiohttp
from asgiref import sync  # assumed source of async_to_sync, as in the linked answer

def async_aiohttp_get_all(urls, cookies):
    async def get_all(urls):
        async with aiohttp.ClientSession(cookies=cookies) as session:
            async def fetch(url):
                async with session.get(url) as response:
                    return await response.json()
            # run all fetches concurrently and collect the results
            return await asyncio.gather(*[
                fetch(url) for url in urls
            ])

    # bridge the async function into synchronous calling code
    return sync.async_to_sync(get_all)(urls)

for x in async_aiohttp_get_all(urls_list, s.cookies.get_dict()):
    print(x)

Now I am successfully able to get responses from all urls in a fraction of the time it was taking with requests, but I want the function to also return the url along with:

return await response.json()

I tried the following, but nothing works. This is my first day ever using async in Python, so I am not even able to search for a solution effectively because nothing makes sense yet.

return await url, response.json()
return await (url, response.json())

CodePudding user response:

I could not run your code exactly as you do, but I returned a tuple with no problem. I also removed the async_to_sync call, since asyncio alone gives you enough flexibility.

import asyncio
import aiohttp

urls_list = [
    "https://www.google.com",
    "https://www.facebook.com",
    "https://www.twitter.com",
]

async def async_aiohttp_get_all(urls, cookies):
    async with aiohttp.ClientSession(cookies=cookies) as session:
        async def fetch(url):
            async with session.get(url) as response:
                return await response.text(), url
        return await asyncio.gather(*[
            fetch(url) for url in urls
        ])

results = asyncio.run(async_aiohttp_get_all(urls_list, None))
for res in results:
    print(res[0][:10], res[1])

Output:

<!doctype  https://www.google.com
<!DOCTYPE  https://www.facebook.com
<!DOCTYPE  https://www.twitter.com

So, in your case, return await response.json(), url should work: await binds only to response.json(), so the whole expression builds a (json, url) tuple.
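To see why the original attempts fail: await url tries to await a plain string, and await (url, response.json()) tries to await a tuple, neither of which is awaitable. The safe pattern is to await the coroutine first and then build the tuple around the result. A minimal runnable sketch of that pattern, using a hypothetical fake_fetch stand-in so it works without a network call:

```python
import asyncio

async def fake_fetch(url):
    # stand-in for session.get(url) / response.json(); just yields control once
    await asyncio.sleep(0)
    return {"url_length": len(url)}

async def fetch_with_url(url):
    # await the coroutine first, then pair its result with the url
    return url, await fake_fetch(url)

async def main(urls):
    # gather preserves input order, so results line up with urls
    return await asyncio.gather(*[fetch_with_url(u) for u in urls])

results = asyncio.run(main(["https://a.example", "https://b.example"]))
for url, payload in results:
    print(url, payload)
```

The same shape drops straight into the fetch coroutine above: return url, await response.json().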
