Dynamically add functions to an already-running celery worker?
I'm getting started with celery, and I want to know whether it is possible to add modules to celeryd processes that have already been started. In other words, instead of adding modules via celeryconfig.py, as in
CELERY_IMPORTS = ("tasks", "additional_module")
before starting the workers, I want to somehow make additional_module available after the worker processes have started.
Thanks in advance.
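For reference, the static route looks like this — a minimal celeryconfig.py sketch, where "additional_module" is a placeholder name for whatever module holds the extra tasks:

```python
# celeryconfig.py -- minimal sketch; "additional_module" is a
# placeholder for the module that defines the extra tasks.
# celeryd imports everything listed here once, at startup, which is
# why a module added to this tuple later is not picked up by a
# worker that is already running.
CELERY_IMPORTS = ("tasks", "additional_module")
```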
1 Answer
You can achieve your goal by starting a new celeryd with an expanded import list and eventually shutting down your old worker gracefully (after it has finished its current jobs).
Because jobs are pushed to a worker asynchronously and are only marked done after celery has finished the work, you won't actually miss any work by doing it this way. You should be able to run both celery workers on the same machine - they'll simply show up as separate connections to RabbitMQ (or whatever queue backend you use).
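A rough sketch of the hand-over, assuming a POSIX system: celeryd treats SIGTERM as a warm shutdown (finish the jobs in progress, then exit), so after starting the replacement worker with the expanded CELERY_IMPORTS, you can retire the old one by PID. `warm_shutdown` is a hypothetical helper name, and looking up the old worker's PID is left out:

```python
import os
import signal

def warm_shutdown(worker_pid):
    """Ask the old celeryd process to retire gracefully.

    celeryd handles SIGTERM as a "warm" shutdown: it stops pulling
    new tasks off the queue, finishes the jobs it is currently
    running, and then exits. Since tasks are only marked done after
    the work completes, anything still pending is handled by the
    new worker instead.
    """
    os.kill(worker_pid, signal.SIGTERM)
```

Start the replacement worker first, confirm it shows up as a new connection on the broker, and only then send the old one its shutdown signal.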