Send all celery task log messages to a single file

Posted 2024-11-10 13:59:30

I'm wondering how to set up a more specific logging system. All my tasks use

logger = logging.getLogger(__name__)

as a module-wide logger.

I want celery to log to "celeryd.log" and my tasks to "tasks.log", but I have no idea how to get this working. Using CELERYD_LOG_FILE from django-celery, I can route all celeryd-related log messages to celeryd.log, but there is no trace of the log messages created in my tasks.
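
For reference, a minimal sketch of the setup described above (the file names and the Django settings location are illustrative):

# settings.py (django-celery): routes celeryd's own messages to celeryd.log
CELERYD_LOG_FILE = "celeryd.log"

# myapp/tasks.py: the module-wide logger mentioned above
import logging
logger = logging.getLogger(__name__)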


Comments (3)

最终幸福 2024-11-17 13:59:30

Note: This answer is outdated as of Celery 3.0, where you now use get_task_logger() to get your per-task logger set up. Please see the Logging section of the What's new in Celery 3.0 document for details.


Celery has dedicated support for logging, per task. See the Task documentation on the subject:

You can use the worker's logger to add diagnostic output to the worker log:

@celery.task()
def add(x, y):
    logger = add.get_logger()
    logger.info("Adding %s + %s" % (x, y))
    return x + y

There are several logging levels available, and the worker's loglevel setting decides
whether or not they will be written to the log file.

Of course, you can also simply use print, since anything written to standard out/-err will be
written to the log file as well.

Under the hood this is all still the standard python logging module. You can set the CELERYD_HIJACK_ROOT_LOGGER option to False to allow your own logging setup to work; otherwise Celery will configure the handling for you.
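
A minimal sketch of that option in a configuration module (the setting name is the pre-4.0 spelling; where you put it depends on your project):

# celeryconfig.py / settings.py: keep Celery from replacing the root
# logger's handlers, so your own logging configuration stays in effect
CELERYD_HIJACK_ROOT_LOGGER = False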

However, for tasks, the .get_logger() call does allow you to set up a separate log file per individual task. Simply pass in a logfile argument and it'll route log messages to that separate file:

@celery.task()
def add(x, y):
    logger = add.get_logger(logfile='tasks.log')
    logger.info("Adding %s + %s" % (x, y))
    return x + y 

Last but not least, you can just configure your top-level package in the python logging module and give it a file handler of its own. I'd set this up using the celery.signals.after_setup_task_logger signal; here I assume all your modules live in a package called foo.tasks (as in foo.tasks.email and foo.tasks.scaling):

from celery.signals import after_setup_task_logger
import logging

def foo_tasks_setup_logging(**kw):
    logger = logging.getLogger('foo.tasks')
    if not logger.handlers:  # attach the handler only once, even if the signal fires again
        handler = logging.FileHandler('tasks.log')
        formatter = logging.Formatter(logging.BASIC_FORMAT)  # you may want to customize this.
        handler.setFormatter(formatter)
        logger.addHandler(handler)
        logger.propagate = False  # keep these records out of the root logger

after_setup_task_logger.connect(foo_tasks_setup_logging)

Now any logger whose name starts with foo.tasks will have all its messages sent to tasks.log instead of to the root logger (which doesn't see any of these messages, because .propagate is False).
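
For example, in a hypothetical module inside that package:

# foo/tasks/email.py
import logging

logger = logging.getLogger(__name__)  # name is 'foo.tasks.email', a child of 'foo.tasks'
logger.info("this record ends up in tasks.log, not in the root logger's output")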

苯莒 2024-11-17 13:59:30

Just a hint: Celery has its own task logger:

from celery.utils.log import get_task_logger
logger = get_task_logger(__name__)

Also, Celery logs all output from the task. More details at the Celery docs for Task Logging.
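
A minimal sketch of using it inside a task (the shared_task decorator and the task body are illustrative):

from celery import shared_task
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)  # logger in the 'celery.task' hierarchy

@shared_task
def add(x, y):
    logger.info("Adding %s + %s", x, y)  # handled by the worker's task log setup
    return x + y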

梦在夏天 2024-11-17 13:59:30

Add
--concurrency=1 --loglevel=INFO
to the command that runs the celery worker,

e.g.: python xxxx.py celery worker --concurrency=1 --loglevel=INFO

It's better to set the log level inside each Python file too.
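
For instance, a minimal sketch of setting a level on a module's logger (the level and module are illustrative):

import logging

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)  # ensure INFO records from this module are emitted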
