Rails: scheduled task to pre-warm the cache?
I am using the following to cache a slow loading page using memcached:
caches_action :complex_report, :expires_in => 1.day
The controller action is protected by Devise authentication.
The page currently gets cached the first time a user requests it. Subsequent requests that day are then pulled from the cache.
The problem with this is that the initial request takes 20-30 seconds to load. Is it possible to populate the cache in advance by way of a scheduled task?
Any suggestions much appreciated.
4 Answers
If it's the process of running the report and collecting the results that is time-consuming, you could cache those results (in place of, or alongside, the action caching) using Rails.cache.write and Rails.cache.read. Then, because you needn't worry about authentication or making requests to the server, running the query and caching the results from a cron job becomes considerably simpler.
Take a look at this gem:
https://github.com/tommyh/preheat
The gem is for preheating your Rails.cache.
From the documentation:
This will "preheat" all your Rails.cache.fetch calls on your homepage. It is as simple as that!
Probably the most basic solution would be to set up a simple cron entry to load the page you want to have a 'hot' cache. This can be as easy as adding the following to the crontab of a user on your server, using crontab -e to open an editor:

*/15 * * * * wget -q http://yourwebpages.url/ > /dev/null 2>&1

What this does is use wget to fetch the data at the provided URL every 15 minutes of every hour, day, month and year, ignore the results, and send no *nix mail in case something goes wrong.
Here is an expansion on the previous cron-based solution, which uses curl's ability to store cookies so that you can authenticate in one step and then reuse that cookie as an authenticated user in the next. So put these lines in a script called "prepare_cache.sh", replacing the login and password parameters with ones that match the variables used in your login form (and, obviously, the URLs to call). The cookie jar is removed beforehand to make sure a file isn't already there, and again at the end to make sure no cookie is left floating about with access levels it shouldn't have.

Then you can call this script from a cron job.

And hopefully that should work. It seemed to work for me when I tried it.
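The original script body was not preserved above, so the following is a hedged reconstruction of what such a prepare_cache.sh might look like. The URL, form field names (user[email], user[password]) and credentials are placeholders: adjust them to match your Devise login form:

```shell
#!/bin/sh
# Hypothetical reconstruction -- all URLs and form parameters are placeholders.
COOKIE_JAR=/tmp/prepare_cache_cookies.txt

# Make sure no stale cookie file is already there.
rm -f "$COOKIE_JAR"

# Step 1: sign in, storing the session cookie in the cookie jar (-c).
curl -s -c "$COOKIE_JAR" \
     -d "user[email]=reports@example.com" \
     -d "user[password]=secret" \
     http://yourwebpages.url/users/sign_in > /dev/null

# Step 2: request the slow page with the stored cookie (-b), as an
# authenticated user, which populates the action cache.
curl -s -b "$COOKIE_JAR" http://yourwebpages.url/complex_report > /dev/null

# Remove the cookie so it can't be reused with access it shouldn't have.
rm -f "$COOKIE_JAR"
```

The cron entry to run it once a day before the cache expires could then look something like:

0 6 * * * /path/to/prepare_cache.sh > /dev/null 2>&1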