Suppose I have 2 servers.
The first is a service that provides some computations, which can take a long time (minutes to hours).
The second server will use this service to have some data computed.
I'm trying to design a REST API for the first server, and so far so good. But I'd like to hear some opinions on how to model the notification once the long-running task is finished.
So far I have considered two approaches:
- Polling - the second server asks every now and then about the result.
- Callback - the second server sets up a URI for the first one to call when it is done (a sketch of this follows below). But this smells a bit in a REST API.
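To make the callback option concrete, here is a minimal sketch of what the job submission could look like if the second server registered a callback URI along with the job. The endpoint path and field names (`/jobs`, `callback`, `id`) are just assumptions for illustration, not an existing API.

```python
# Hypothetical sketch: the second server submits a job and registers a
# callback URI that the first server would POST to when the job finishes.
# All URLs, paths and field names are made up for illustration.
import requests

COMPUTE_SERVER = "http://compute.example.com"

response = requests.post(
    f"{COMPUTE_SERVER}/jobs",
    json={
        "input": {"dataset": "example"},                    # whatever the computation needs
        "callback": "http://client.example.com/job-done",   # URI to notify on completion
    },
)
response.raise_for_status()
print("Job created:", response.json().get("id"))            # assumed response field
```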
What do you think?
3 Answers
For your situation I would choose polling. When the second server makes the initial request to create the job on the first server, it should get back a response containing the URL of the eventual status page. The second server then polls that URL every 5-15 minutes to check the status of the job. If the first server makes that URL an RSS or Atom feed, then people could also point their RSS readers at the same URL and find out for themselves whether the job is done. It's a real win when both people and machines can get information out of a single source.
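A minimal sketch of that polling loop, assuming the create-job response carries a `status_url` field and the status page returns JSON with a `state` field (both field names are assumptions for illustration):

```python
# Hypothetical polling client: submit the job, then poll the status URL
# returned by the compute server until the job reports completion.
import time
import requests

COMPUTE_SERVER = "http://compute.example.com"

# Create the job; assume the response body includes the status page URL.
created = requests.post(f"{COMPUTE_SERVER}/jobs", json={"input": "example"})
created.raise_for_status()
status_url = created.json()["status_url"]      # assumed field name

# Poll every 5 minutes until the job is finished.
while True:
    status = requests.get(status_url)
    status.raise_for_status()
    body = status.json()
    if body.get("state") == "finished":        # assumed status value
        print("Result available at:", body.get("result_url"))
        break
    time.sleep(5 * 60)
```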
In addition to what I've already answered in this similar question, I'd suggest using the Atom Publishing Protocol for the notification (you could publish to your second server).
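As a rough illustration of what that might look like, the first server could POST an Atom entry (per the Atom Publishing Protocol) to a collection exposed by the second server when the job completes. The collection URL and entry contents below are assumptions made for the sketch:

```python
# Hypothetical sketch: the compute server publishes a "job finished" Atom entry
# to an AtomPub collection hosted by the second (client) server.
import requests

# Assumed AtomPub collection URI on the second server.
COLLECTION_URL = "http://client.example.com/notifications"

entry = """<?xml version="1.0" encoding="utf-8"?>
<entry xmlns="http://www.w3.org/2005/Atom">
  <title>Job 42 finished</title>
  <id>urn:uuid:00000000-0000-0000-0000-000000000042</id>
  <updated>2010-01-01T12:00:00Z</updated>
  <link rel="related" href="http://compute.example.com/jobs/42/result"/>
  <content type="text">The long-running computation has completed.</content>
</entry>"""

response = requests.post(
    COLLECTION_URL,
    data=entry.encode("utf-8"),
    headers={"Content-Type": "application/atom+xml;type=entry"},
)
response.raise_for_status()  # a 201 Created means the entry was accepted
```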
If you use Python, you can take advantage of RabbitMQ and Celery to do the job. Celery lets you create an item in a queue and then pause execution of whatever you're running it from (e.g. Django) so that you can consume the output of the queue processor as it becomes available. No need for polling OR callbacks.
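For instance, a minimal Celery setup along those lines might look like the sketch below; the broker URL, result backend, and task body are placeholders, not a recommendation for any specific configuration:

```python
# Hypothetical sketch of the Celery approach: the long computation runs as a
# task on a worker, and the caller blocks on (or later checks) the result.
import time
from celery import Celery

# Assumed RabbitMQ broker plus a result backend so results can be fetched.
app = Celery("compute", broker="amqp://localhost//", backend="rpc://")

@app.task
def long_computation(n):
    """Stand-in for the minutes-to-hours computation."""
    time.sleep(n)
    return n * n

if __name__ == "__main__":
    # Enqueue the work; this returns immediately with an AsyncResult handle.
    result = long_computation.delay(5)
    # Either block here until the worker finishes...
    print("Computed:", result.get(timeout=600))
    # ...or store result.id and call result.ready() later instead.
```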