Does Celery's concurrency equal the number of tasks it can execute at once?
I understand that you should essentially set your worker count to the number of cores your node has, and if you go beyond that, you'll probably overwhelm the node.
I have hundreds of web requests (each as its own task) that need to run every minute; currently I route all of them through apply_async().
If I set my concurrency to -c 10, does that mean it can only execute up to 10 of those requests at a time? Or is the concurrency count not necessarily equal to the number of tasks it can execute at once?
It would be a waste of resources and wildly inefficient to handle only 10 requests at a time when most of that time is spent just waiting on the request to finish. The articles I've found on mixing asyncio and Celery suggest it's not a great idea. So what would be the solution here? Was Celery the wrong move, or does a concurrency of 10 ≠ only 10 simultaneous tasks?
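For context, a minimal sketch of the kind of setup described above (the task name, broker URL, and URL list are placeholders, not from the original post):

```python
# tasks.py -- hypothetical names, only illustrating the setup described above
import requests
from celery import Celery

app = Celery("scraper", broker="redis://localhost:6379/0")

@app.task
def fetch_status(url):
    # Each task spends almost all of its time blocked on network I/O.
    return requests.get(url, timeout=30).status_code

def enqueue_all(urls):
    # Called once a minute (e.g. from celery beat or cron); every URL
    # becomes its own task dispatched through apply_async().
    for url in urls:
        fetch_status.apply_async(args=[url])
```

With the default prefork pool, --concurrency is the number of worker processes, so a value of 10 does mean at most ten of these tasks executing at any moment.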
Comments (1)
I was using the wrong approach. I should've gone with a thread-based approach using Celery: https://www.technoarchsoftwares.com/blog/optimization-using-celery-asynchronous-processing-in-django/
Switched to gevent and it's working exactly as I'd imagined now.
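For later readers, this is roughly what that switch looks like (a sketch; the app module name proj is a placeholder, not from the original answer). With the gevent pool, --concurrency counts greenlets rather than OS processes, so it can be set far higher than the core count for I/O-bound tasks like these:

```
# install the gevent bundle, then start the worker with the gevent pool
pip install "celery[gevent]"
celery -A proj worker --pool=gevent --concurrency=100
```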