Rails: stalker, resque, or something else?


The actual problem

I am using Rails 3.0.4 and MongoDB through mongoid.

I am using @mailgun to deliver my emails. I send both bulk (e.g. newsletters) and transactional (e.g. account activation, forgotten password) emails. Right now I am using a single domain (which translates to a single queue on Mailgun's end) to deliver those mails. A problem arises when I already have a lot of bulk emails queued up and somebody registers or requests a new password. I want the transactional emails to be delivered before the bulk mail, but the Mailgun queue works on a FIFO basis.

I figured one way to mitigate this could be to use different domains (and hence different queues that can be processed simultaneously) for bulk and transactional mail. In Rails, SMTP settings are application-level settings rather than request-level settings, so I figured I would use different environments with different SMTP settings.
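One alternative to running separate environments, sketched below on the assumption that ActionMailer's smtp_settings is an inheritable class attribute (as it is in Rails 3), is to give each mailer class its own SMTP settings. The domains and credentials here are placeholders, not the actual Mailgun configuration:

    # Hypothetical per-mailer SMTP settings; domains and credentials are placeholders.
    class TransactionalMailer < ActionMailer::Base
      self.smtp_settings = {
        :address        => "smtp.mailgun.org",
        :port           => 587,
        :domain         => "transactional.example.com",
        :user_name      => "postmaster@transactional.example.com",
        :password       => "secret",
        :authentication => :plain
      }
    end

    class BulkMailer < ActionMailer::Base
      self.smtp_settings = {
        :address        => "smtp.mailgun.org",
        :port           => 587,
        :domain         => "bulk.example.com",
        :user_name      => "postmaster@bulk.example.com",
        :password       => "secret",
        :authentication => :plain
      }
    end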

I also have a queue system for mail and use delayed_job to handle it. I can't figure out a way to differentiate between bulk and transactional mail in delayed_job, so I decided to move my queue system to either resque+redis or beanstalkd+stalker, where I can tag queues and ask a worker to process only a particular queue.
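For illustration, a minimal sketch of what that queue tagging could look like on the Resque side; the job and mailer names (TransactionalMailJob, BulkMailJob, UserMailer, NewsletterMailer) are made up for the example:

    # Each job declares the queue it belongs to via @queue.
    class TransactionalMailJob
      @queue = :transactional

      def self.perform(user_id)
        UserMailer.activation(user_id).deliver
      end
    end

    class BulkMailJob
      @queue = :bulk

      def self.perform(newsletter_id)
        NewsletterMailer.issue(newsletter_id).deliver
      end
    end

    # Enqueue:
    #   Resque.enqueue(TransactionalMailJob, user.id)
    #   Resque.enqueue(BulkMailJob, newsletter.id)
    #
    # A worker can then be restricted to a single queue:
    #   QUEUE=transactional rake resque:work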

The question

I want something that is easy to maintain, is the least resource-hungry, and can scale well.

  • With delayed_job I don't need to run or monitor any additional server.
  • For delayed_job I am using a 256MB slice on Rackspace, but resque or stalker would require running another server process (redis or beanstalkd, respectively).
  • I have no idea about scaling yet, but it's only the second month since my app launched and I have already sent 30k+ emails.

If there are any alternatives to porting from delayed_job to resque or stalker, please let me know.

Update:

It seems delayed_job also has support for named queues now, but it is not documented yet. I'm opening a ticket to add documentation and will update with details once I know how to use them :)
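For reference, this is roughly how named queues ended up working in later delayed_job releases (3.x); treat it as a sketch rather than what was available at the time, and note that the mailer and method names are placeholders:

    # Tag a job with a queue when enqueuing it:
    Notifier.delay(:queue => "transactional").password_reset(user.id)
    Newsletter.delay(:queue => "bulk").weekly_issue(issue.id)

    # Run a worker that only drains a particular queue:
    #   QUEUES=transactional rake jobs:work
    # or configure it in code:
    #   Delayed::Worker.queues = ["transactional"]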


1 Answer


Delayed job accepts an optional priority parameter (it's explained down near the end of the linked page).

For example:

Delayed::Job.enqueue(MailingJob.new(params[:id]), 3)

...where 3 is the priority.

So, when enqueuing your mass-mailings, give them a larger priority number (like the 3 above; in delayed_job, lower numbers run first and 0 is the default), and leave your transactional emails at the default priority. This way, your transactional emails will get entered into the @mailgun queue ahead of any bulk mail that isn't already on its way out.

Unless your outgoing SMTP server has some kind of slow connection, it'll probably send several hundred emails per minute, so I wouldn't worry too much if there are, for example, 200 mass-mailings already handed off to @mailgun and you're waiting on a transactional email; it will still send shortly thereafter.
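To make the priority split concrete, here is a small sketch (MailingJob comes from the example above; ActivationJob is a hypothetical transactional job). In delayed_job, lower numbers run first and the default priority is 0:

    # Bulk newsletter: give it a larger (i.e. lower-priority) number.
    Delayed::Job.enqueue(MailingJob.new(params[:id]), 3)

    # Transactional email: leave it at the default priority of 0,
    # so it is worked off ahead of any queued bulk mail.
    Delayed::Job.enqueue(ActivationJob.new(user.id))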
