How do I make an asyncio Semaphore sleep/delay after each batch of tasks finishes?
Consider the following partial script, which checks a list of URLs:
import asyncio
from aiohttp import ClientSession

async def checker(url, session: ClientSession, sem: asyncio.Semaphore):
    async with sem:  # the semaphore caps how many checkers run at once
        async with session.get(url) as r:  # session.get, not requests.get, for aiohttp
            # The second return value is truncated in the original ("r."); r.status is assumed.
            return r.real_url, r.status

async def asyncStarter(url_listy):
    async with ClientSession() as session:
        sem = asyncio.Semaphore(35)
        listy = await asyncio.gather(*(checker(u, session, sem) for u in url_listy))
    return listy
The Semaphore value is set to 35, which means that at most 35 concurrent calls are in flight at any time.
I want Python to basically perform a time.sleep() after every batch of calls is finished. For example, how can I make it so that after every 100 URLs are checked, Python waits 15 seconds, then checks another 100 URLs, waits another 15 seconds, and so on? I'm looking at the docs for both asyncio and time.sleep(), and I can't figure out how to make this periodic. I'm also open to a delay every N seconds; that would be fine too.
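One way to get this behaviour, sketched under the assumption that checker stays as defined above, is to split url_listy into chunks of 100, await each chunk with asyncio.gather, and call asyncio.sleep(15) between chunks (time.sleep would block the event loop). The function name asyncStarterBatched and the chunk_size/delay parameters are made up for illustration:

import asyncio
from aiohttp import ClientSession

async def asyncStarterBatched(url_listy, chunk_size=100, delay=15):
    # Hypothetical batched variant: check URLs in groups of chunk_size,
    # pausing delay seconds between groups. Reuses checker() from above.
    results = []
    async with ClientSession() as session:
        sem = asyncio.Semaphore(35)  # still at most 35 concurrent requests within a batch
        for i in range(0, len(url_listy), chunk_size):
            batch = url_listy[i:i + chunk_size]
            results.extend(await asyncio.gather(*(checker(u, session, sem) for u in batch)))
            if i + chunk_size < len(url_listy):  # no pause needed after the final batch
                await asyncio.sleep(delay)  # non-blocking sleep; time.sleep would freeze the loop
    return results

Because asyncio.gather waits for the whole chunk, the 15-second pause only starts once all 100 checks in that batch have returned.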