Trying to return two values from an async aiohttp call

Posted on 2025-01-22 08:46:23

Today I was trying to speed up my script and found great example code in another Stack Overflow post. Basically, I found a way to make asynchronous requests to the web using aiohttp instead of requests. Here is the link to that post (I copied the code from DragonBobZ's answer).

Link to the Stack Overflow post from which I copied the code

The issue is that I am trying to get it to return two values (url, response) instead of just the response from the request made. Here is the code I took.

import asyncio

import aiohttp
from asgiref import sync  # provides sync.async_to_sync

def async_aiohttp_get_all(urls, cookies):
    async def get_all(urls):
        async with aiohttp.ClientSession(cookies=cookies) as session:
            async def fetch(url):
                async with session.get(url) as response:
                    return await response.json()
            return await asyncio.gather(*[
                fetch(url) for url in urls
            ])

    # async_to_sync lets the synchronous caller drive the coroutine
    return sync.async_to_sync(get_all)(urls)

for x in async_aiohttp_get_all(urls_list, s.cookies.get_dict()):
    print(x)

Now I am successfully able to get responses from all the urls in a fraction of the time it was taking with requests, but I also want the function to return the url along with:

return await response.json()

I tried the following, but nothing works. This is my first day ever using async in Python, so I can't even search for a solution effectively, because nothing makes sense to me yet.

return await url, response.json()
return await (url, response.json())
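For reference, why both attempts fail can be demonstrated without aiohttp at all: `return await url, response.json()` parses as `(await url), response.json()`, so Python first tries to await the plain string, which is not awaitable. The `attempt` helper below is purely illustrative, not part of the original code:

```python
import asyncio

async def attempt(url):
    # `await url, x` parses as `(await url), x`, so Python tries to
    # await the plain string first, which raises a TypeError.
    try:
        await url
    except TypeError as exc:
        return str(exc)

msg = asyncio.run(attempt("https://example.com"))
print(msg)  # e.g. "object str can't be used in 'await' expression"
```

The second attempt, `await (url, response.json())`, fails the same way: a tuple is not awaitable either.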


绮烟 2025-01-29 08:46:23


I could not run the code exactly the way you do, but I returned a tuple with no problem. I also removed the sync call, since asyncio gives you enough flexibility on its own.

import asyncio
import aiohttp

urls_list = [
    "https://www.google.com",
    "https://www.facebook.com",
    "https://www.twitter.com",
]

async def async_aiohttp_get_all(urls, cookies):
    async with aiohttp.ClientSession(cookies=cookies) as session:
        async def fetch(url):
            async with session.get(url) as response:
                return await response.text(), url
        return await asyncio.gather(*[
            fetch(url) for url in urls
        ])

results = asyncio.run(async_aiohttp_get_all(urls_list, None))
for res in results:
    print(res[0][:10], res[1])

Output:

<!doctype  https://www.google.com
<!DOCTYPE  https://www.facebook.com
<!DOCTYPE  https://www.twitter.com

So, in your case, return await response.json(), url should work.
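A minimal, self-contained sketch of that fix, using a stand-in coroutine in place of aiohttp's `response.json()` so it runs without a network (the `get_json` helper and the example URLs are assumptions for illustration only):

```python
import asyncio

async def get_json(url):
    # Stand-in for aiohttp's response.json(): any awaitable works the same way.
    await asyncio.sleep(0)
    return {"url_len": len(url)}

async def fetch(url):
    # `return await get_json(url), url` parses as `(await get_json(url)), url`:
    # only the coroutine is awaited, then a (json, url) tuple is returned.
    return await get_json(url), url

async def main(urls):
    return await asyncio.gather(*(fetch(u) for u in urls))

results = asyncio.run(main(["https://a.example", "https://b.example"]))
for body, url in results:
    print(url, body)
```

Because `await` binds tighter than the tuple comma, no extra parentheses are needed around the awaited expression.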
