Celery - scheduling a periodic task starting at a specific datetime

Published 2024-12-11 05:09:56


What is the best way to schedule a periodic task starting at specific datetime?

(I'm not using cron for this, considering I need to schedule about a hundred remote rsyncs, where I compute the remote vs. local offset and would need to rsync each path the second the logs are generated on each host.)

By my understanding, the celery.task.schedules crontab class only allows specifying hour, minute, and day of week.
The most useful tip I've found so far is this answer by nosklo.

Is this the best solution?
Am I using the wrong tool for the job?
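
For context, a minimal sketch of the two schedule types being compared here (using the celery.schedules module; the concrete values are purely illustrative):

from datetime import timedelta
from celery.schedules import crontab

# A crontab schedule bottoms out at minute granularity: it can express
# "04:30 every Monday", but not "every 30 seconds starting at an exact datetime".
every_monday_morning = crontab(minute=30, hour=4, day_of_week=1)

# A plain timedelta gives per-second resolution, but it describes a
# frequency, not a clock time or a start datetime.
every_thirty_seconds = timedelta(seconds=30)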


2 Answers

尴尬癌患者 2024-12-18 05:09:56


Celery seems like a good solution for your scheduling problem: Celery's PeriodicTasks have run time resolution in seconds.

You're using an appropriate tool here, but the crontab entry is not what you want. You want to use Python's datetime.timedelta object; the crontab scheduler in celery.schedules has only minute resolution, but using a timedelta to configure the PeriodicTask interval provides strictly more functionality, in this case per-second resolution.

e.g., from the Celery docs:

>>> from celery.task import tasks, PeriodicTask
>>> from datetime import timedelta
>>> class EveryThirtySecondsTask(PeriodicTask):
...     run_every = timedelta(seconds=30)
...
...     def run(self, **kwargs):
...         logger = self.get_logger(**kwargs)
...         logger.info("Execute every 30 seconds")

http://ask.github.com/celery/reference/celery.task.base.html#celery.task.base.PeriodicTask

class datetime.timedelta(days=0, seconds=0, microseconds=0, milliseconds=0, minutes=0, hours=0, weeks=0)
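
For what it's worth, newer Celery releases (4+) have replaced the PeriodicTask subclass style shown above with a plain task plus a beat schedule entry. A rough equivalent under that assumption, with hypothetical broker URL, host, and path:

from datetime import timedelta
from celery import Celery

# Broker URL, host, and path are assumptions for the sketch.
app = Celery("rsync_app", broker="redis://localhost:6379/0")

@app.task
def rsync_host(host, path):
    # Placeholder for the real rsync invocation.
    print("rsync %s:%s" % (host, path))

# Equivalent of run_every = timedelta(seconds=30): beat enqueues the task
# every 30 seconds, i.e. with per-second interval resolution.
app.conf.beat_schedule = {
    "rsync-every-30-seconds": {
        "task": rsync_host.name,
        "schedule": timedelta(seconds=30),
        "args": ("host1.example.com", "/var/log/app"),
    },
}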

The only challenge here is that you have to describe the frequency with which you want this task to run rather than at what clock time you want it to run; however, I would suggest you check out the Advanced Python Scheduler http://packages.python.org/APScheduler/

It looks like Advanced Python Scheduler could easily be used to launch normal (i.e., non-periodic) Celery tasks at any schedule of your choosing, using its own scheduling functionality.
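
For the "start at a specific datetime" part, something along these lines might work: a sketch assuming APScheduler 3.x (whose API differs from the older docs linked above) and a hypothetical Celery task rsync_host defined in your own tasks module.

from datetime import datetime
from apscheduler.schedulers.blocking import BlockingScheduler

from tasks import rsync_host  # assumption: your own Celery task module

scheduler = BlockingScheduler()

# Fire every 30 seconds, anchored to an exact start datetime; each firing
# only enqueues the Celery task, so a worker does the actual rsync.
scheduler.add_job(
    rsync_host.delay,
    trigger="interval",
    seconds=30,
    start_date=datetime(2025, 1, 1, 4, 30, 17),
    args=["host1.example.com", "/var/log/app"],
)

scheduler.start()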

八巷 2024-12-18 05:09:56


I've recently worked on a task that involved Celery, and I had to use it for asynchronous operations as well as scheduled tasks. Suffice it to say I went back to the old crontab for the scheduled tasks, although the cron job calls a Python script that spawns a separate asynchronous task. This way I have less to maintain in the crontab (making the Celery scheduler run there requires some further setup), but I make full use of Celery's asynchronous capabilities.
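
A minimal sketch of that arrangement, with entirely hypothetical names and paths (the crontab line is shown as a comment so the whole example stays in Python):

# enqueue_rsync.py -- tiny script that cron invokes; the real work is handed
# off to a Celery worker via .delay(), so cron only provides the timing.
#
# Example crontab line (schedule and paths are assumptions):
#   * * * * *  /usr/bin/python /opt/app/enqueue_rsync.py host1.example.com /var/log/app

import sys

from tasks import rsync_host  # assumption: a Celery task in your own tasks module

if __name__ == "__main__":
    host, path = sys.argv[1], sys.argv[2]
    # Returns immediately; the rsync itself runs asynchronously on a worker.
    rsync_host.delay(host, path)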
