Using MongoDB as the message queue for Celery

I'm trying to use MongoDB as the message queue for Celery (in a Django app). The current development version of Celery (2.2.0rc2) is supposed to let you do this, but I can't seem to get any workers to pick up tasks I'm creating.

Versions:
celery v2.2.0rc3
mongodb 1.6.5
pymongo 1.9
django-celery 2.2.0rc2

In my settings, I have:

CELERY_RESULT_BACKEND = "mongodb"
CELERY_MONGODB_BACKEND_SETTINGS = {
    # Shouldn't need these - defaults are correct.
    "host": "localhost",
    "port": 27017,
    "database": "celery",
    "taskmeta_collection": "messages",
}

BROKER_BACKEND = 'mongodb'
BROKER_HOST = "localhost"
BROKER_PORT = 27017
BROKER_USER = ""
BROKER_PASSWORD = ""
BROKER_VHOST = ""

import djcelery
djcelery.setup_loader()

I've created a test tasks.py file as follows:

from celery.decorators import task

@task()
def add(x, y):
    return x + y

If I fire up celeryd in the background, it appears to start normally. I then open a python shell and run the following:

>>> from myapp.tasks import add
>>> result = add.delay(5,5)
>>> result
<AsyncResult: 7174368d-288b-4abe-a6d7-aeba987fa886>
>>> result.ready()
False

Problem is that no workers ever pick up the tasks. Am I missing a setting or something? How do I point celery to the message queue?
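For reference, the worker can be run in the foreground with more verbose logging through django-celery's management command, which makes it easier to see whether it actually connects to the broker and registers the task (a sketch, assuming the django-celery 2.x command-line flags):

python manage.py celeryd --loglevel=INFO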

Comments (4)

生来就爱笑 2024-10-21 11:38:42

We had this same issue. While the doc says all tasks should be registered in Celery by calling

import djcelery
djcelery.setup_loader()

it wasn't working properly. So, we still used the

CELERY_IMPORTS = ('YOUR_APP.tasks',) 

setting in settings.py. Also, make sure you restart Celery if you add a new task because Celery has to register the tasks when it first starts.

Django, Celerybeat and Celery with MongoDB as the Broker
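Putting that advice together with the broker settings from the question, a minimal settings.py sketch might look like the following (the "myapp.tasks" path is just the example module from the question):

import djcelery
djcelery.setup_loader()

BROKER_BACKEND = "mongodb"
BROKER_HOST = "localhost"
BROKER_PORT = 27017

# Explicit registration, in case setup_loader() does not pick the module up.
CELERY_IMPORTS = ("myapp.tasks",)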

离鸿 2024-10-21 11:38:42

Remember that Kombu only works with MongoDB 1.3+, because it needs the findAndModify command.
If you are on Ubuntu, the latest version in the repositories is 1.2, which doesn't work.

You may also need to set
BROKER_VHOST = "dbname"

Keep me posted if it works.
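One quick way to confirm the server is new enough is to ask it for its version through pymongo (a sketch; Connection is the pymongo 1.x client class, later releases renamed it to MongoClient):

from pymongo import Connection

conn = Connection("localhost", 27017)
print(conn.server_info()["version"])  # must be 1.3 or newer for findAndModify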

爱格式化 2024-10-21 11:38:42

Be sure to add this to your settings, or the workers can't find the task and will fail silently.

CELERY_IMPORTS = ("namespace", )

硬不硬你别怂 2024-10-21 11:38:42

I had the same issue but when I upgraded to celery 2.3.3 everything worked like a charm.
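If you want to try the same fix, an upgrade along these lines should do it (the celery 2.3.3 pin comes from this comment; letting pip pick a matching django-celery release is my assumption):

pip install -U celery==2.3.3 django-celery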
