How can I make an asyncio Semaphore sleep/delay after every batch of tasks finishes?

Posted 2025-01-11 15:18:37


Consider the following partial script, which checks a list of URLs:

import asyncio

from aiohttp import ClientSession

async def checker(url, session: ClientSession, sem: asyncio.Semaphore):
    # The semaphore caps how many requests run concurrently.
    async with sem:
        async with session.get(url) as r:
            # The second return value is cut off in the original; r.status is assumed here.
            return r.real_url, r.status

async def asyncStarter(url_listy):
    listy = []

    async with ClientSession() as session:
        sem = asyncio.Semaphore(35)
        listy = await asyncio.gather(*(checker(u, session, sem) for u in url_listy))
    return listy
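For reference, assuming the corrected snippet above, the whole thing could be driven like this (the urls list here is only a placeholder):

urls = ["https://example.com/a", "https://example.com/b"]  # placeholder list of URLs to check
results = asyncio.run(asyncStarter(urls))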

The Semaphore value is set to 35, which means that at most 35 concurrent calls are in progress at any time.

I want Python to essentially perform a time.sleep() after a certain number of calls have finished. For example, how can I make it so that after every 100 URLs checked, Python waits 15 seconds, then after the next 100 URLs waits another 15 seconds, and so on? I'm looking at the docs for both asyncio and time.sleep(), and I can't figure out how to make this periodic. I'm also open to a delay every N seconds; that would be fine too.
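One way this could be done (a minimal sketch, not part of the original question; checkInBatches, batch_size, and pause are names I've made up) is to slice url_listy into chunks of 100 and await asyncio.sleep(15) between chunks, reusing the checker coroutine above:

async def checkInBatches(url_listy, batch_size=100, pause=15):
    results = []
    async with ClientSession() as session:
        sem = asyncio.Semaphore(35)
        for i in range(0, len(url_listy), batch_size):
            batch = url_listy[i:i + batch_size]
            # Within each batch, the semaphore still limits concurrency to 35.
            results += await asyncio.gather(*(checker(u, session, sem) for u in batch))
            if i + batch_size < len(url_listy):
                # Non-blocking counterpart of time.sleep(); the event loop keeps running.
                await asyncio.sleep(pause)
    return results

Using asyncio.sleep() rather than time.sleep() is what keeps the event loop responsive while the script pauses between batches.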
