Python - Running multiple async functions simultaneously

Posted on 2025-01-18 11:39:51


I'm essentially making a pinger that has a 2D list of key/webhook pairs; after pinging a key, it sends the response to a webhook.

The 2D list looks like this:

some_list = [["key1", "webhook1"], ["key2", "webhook2"]]

My program is essentially a loop, and I'm not too sure how to rotate through the some_list data inside the function.

Here's a little demo of what my script looks like:

async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        tasks = await gen_tasks(s, some_pair)
        results = await asyncio.gather(*tasks)
        sleep(10)
        await do_ping(some_pair)

I've tried:

async def main(): 
    for entry in some_list: 
        asyncio.run(do_ping(entry))

But because the do_ping function is a self-calling loop, it just runs the first entry over and over again and never gets to the ones after it. I'm hoping to find a solution to this, whether that's threading or something similar. And if you have a better way of structuring the some_list values (which I assume would be a dictionary), feel free to drop that feedback as well; a rough sketch of what I mean is below.
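For reference, the dictionary layout I have in mind (names purely illustrative) would map each key straight to its webhook:

some_dict = {"key1": "webhook1", "key2": "webhook2"}

# iterating yields the same pairs as the 2D list:
for key, webhook in some_dict.items():
    ...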

Comments (1)

盛夏已如深秋 | 2025-01-25 11:39:51


You made your method recursive with await do_ping(some_pair); it never returns, so the loop in main never continues. I would restructure the application like this:

import asyncio
import aiohttp

async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        while True:  # loop inside the coroutine instead of recursing
            tasks = await gen_tasks(s, some_pair)
            results = await asyncio.gather(*tasks)
            await asyncio.sleep(10)  # non-blocking sleep, unlike time.sleep


async def main():
    # one long-running do_ping per key/webhook pair, all run concurrently
    tasks = [do_ping(entry) for entry in some_list]
    await asyncio.gather(*tasks)


if __name__ == "__main__":
    asyncio.run(main())

Alternatively, you could move the repeat-and-sleep logic into main:

async def do_ping(some_pair):
    async with aiohttp.ClientSession() as s:
        tasks = await gen_tasks(s, some_pair)
        results = await asyncio.gather(*tasks)


async def main(): 
    while True:
        tasks = [do_ping(entry) for entry in some_list]
        await asyncio.gather(*tasks)
        await asyncio.sleep(10)


if __name__ == "__main__":
    asyncio.run(main())

You could also start the tasks before calling sleep, and gather them afterwards. That would make the pings start at more consistent 10-second intervals, instead of 10 seconds plus the time it takes to gather the results:

async def main(): 
    while True:
        tasks = [
            asyncio.create_task(do_ping(entry))
            for entry in some_list
        ]
        await asyncio.sleep(10)
        await asyncio.wait(tasks)

EDIT: As pointed out by creolo, you should only create a single ClientSession object. See https://docs.aiohttp.org/en/stable/client_reference.html:

Session encapsulates a connection pool (connector instance) and supports keepalives by default. Unless you are connecting to a large, unknown number of different servers over the lifetime of your application, it is suggested you use a single session for the lifetime of your application to benefit from connection pooling.

async def do_ping(session, some_pair):
    tasks = await gen_tasks(session, some_pair)
    results = await asyncio.gather(*tasks)

async def main(): 
    async with aiohttp.ClientSession() as session:
        while True:
            tasks = [
                asyncio.create_task(do_ping(session, entry))
                for entry in some_list
            ]
            await asyncio.sleep(10)
            await asyncio.wait(tasks)
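Since gen_tasks never appears in the question, here is a minimal sketch of what it could look like, purely for completeness. Everything in it is an assumption made for illustration: it treats the key as a URL to GET (the "ping") and POSTs the response body to the webhook.

import asyncio
import aiohttp

async def ping_and_report(session, key, webhook):
    # Hypothetical ping: GET the key's URL, then report the body to the webhook
    async with session.get(key) as resp:
        body = await resp.text()
    async with session.post(webhook, json={"result": body}):
        pass

async def gen_tasks(session, some_pair):
    key, webhook = some_pair
    # Return a list of awaitables, matching the asker's asyncio.gather(*tasks) usage
    return [asyncio.create_task(ping_and_report(session, key, webhook))]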