Django asynchronous processing

Published 2024-10-09 02:05:46


I have a bunch of Django requests which execute some mathematical computations (written in C and executed via a Cython module) that may take an indeterminate amount of time (on the order of 1 second) to run. Also, the requests don't need to access the database and are all independent of each other and of Django.

Right now everything is synchronous (using Gunicorn with sync worker types), but I'd like to make this asynchronous and nonblocking. In short, I'd like to do something like this:

  1. Receive the AJAX request
  2. Allocate the task to an available worker (without blocking the main Django web application)
  3. The worker executes the task in some unknown amount of time
  4. Django returns the result of the computation (a list of strings) as JSON whenever the task completes
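The hand-off the four steps describe can be sketched with the standard library alone; this is a toy illustration, not Django code, and `heavy_compute` is a placeholder for the Cython computation:

```python
import json
from concurrent.futures import ThreadPoolExecutor

def heavy_compute(numbers):
    # Stand-in for the C/Cython math; returns a list of strings.
    return [str(n * n) for n in numbers]

# Step 2: hand the task to a worker pool. submit() returns immediately,
# so the caller is not blocked while the computation runs.
executor = ThreadPoolExecutor(max_workers=4)
future = executor.submit(heavy_compute, [2, 3, 4])

# Steps 3-4: the worker runs for an unknown amount of time; once the
# result is needed, serialize the list of strings as JSON.
print(json.dumps(future.result()))  # -> ["4", "9", "16"]
executor.shutdown()
```

Note that threads only help here if the Cython module releases the GIL during the C computation; otherwise a process pool or an external worker (as the answers below discuss) is the safer choice.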

I am very new to asynchronous Django, so my question is: what is the best stack for doing this?

Is this sort of process something a task queue is well suited for? Would anyone recommend Tornado + Celery + RabbitMQ, or perhaps something else?

Thanks in advance!


Comments (2)

长伴 2024-10-16 02:05:46


Celery would be perfect for this.

Since what you're doing is relatively simple (read: you don't need complex rules about how tasks should be routed), you could probably get away with using the Redis backend, which means you don't need to set up and configure RabbitMQ (which, in my experience, is more difficult).

I'm using Redis with the most recent dev build of Celery, and here are the relevant bits of my config:

# Use redis as a queue
BROKER_BACKEND = "kombu.transport.pyredis.Transport"
BROKER_HOST = "localhost"
BROKER_PORT = 6379
BROKER_VHOST = "0"

# Store results in redis
CELERY_RESULT_BACKEND = "redis"
REDIS_HOST = "localhost"
REDIS_PORT = 6379
REDIS_DB = "0"
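(The setting names above come from an old Celery/django-celery release. On current Celery versions the broker and result backend are each configured with a single URL; the fragment below is the rough modern equivalent, an assumption on my part rather than part of the original answer.)

```python
# settings.py -- approximate modern equivalent of the config above
# (assumes Celery >= 4, where URL-style settings replaced the old names)
CELERY_BROKER_URL = "redis://localhost:6379/0"      # task queue lives in Redis
CELERY_RESULT_BACKEND = "redis://localhost:6379/0"  # results stored in Redis
```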

I'm also using django-celery, which makes the integration with Django painless.

Comment if you need any more specific advice.

从﹋此江山别 2024-10-16 02:05:46


Since you are planning to make it async (presumably using something like gevent), you could also consider making a threaded/forked backend web service for the computational work.

The async frontend server could handle all the light work, get data from databases that are suitable for async (Redis, or MySQL with a special driver), etc. When a computation has to be done, the frontend server can post all input data to the backend server and retrieve the result when the backend server is done computing it.

Since the frontend server is async, it will not block while waiting for the results. The advantage of this, as opposed to using Celery, is that you can return the result to the client as soon as it becomes available.

client browser <> async frontend server <> backend server for computations
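As a concrete sketch of this split, here is a minimal threaded backend service plus a client round-trip, using only the standard library. The port, URL, and `heavy_compute` placeholder are all illustrative; a real frontend would make this request with an async HTTP client instead of `urlopen`:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import Request, urlopen

def heavy_compute(numbers):
    # Stand-in for the C/Cython math; returns a list of strings.
    return [str(n * n) for n in numbers]

class ComputeHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON input posted by the frontend, run the computation
        # on this request's thread, and reply with the JSON result.
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"result": heavy_compute(payload["numbers"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the sketch quiet

def start_backend(port):
    # One thread per request: the "threaded backend" of the answer.
    server = ThreadingHTTPServer(("127.0.0.1", port), ComputeHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_backend(8808)
    req = Request("http://127.0.0.1:8808/compute",
                  data=json.dumps({"numbers": [2, 3]}).encode(),
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        print(json.loads(resp.read())["result"])  # -> ['4', '9']
    server.shutdown()
```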