What's the fastest way to calculate disk usage per customer?
I'm hoping this is a simple one.
I run a Rails web app where I'm hosting about 100 school websites. The one app handles all the sites, and I have a management interface where we can add and remove schools etc...
I want to add a stat to this interface: the total disk space used by each school. Each school's files are stored in a separate directory structure, so they're easy to find. The only problem is I need it to be fast. So the question is: what's the fastest way to find this info? If it could be found via a Ruby call on the fly that would be great, but I'm open to whatever will work. Ideally I'd like to avoid having to cache and background-generate this data (at least at the Rails level). :)
If you want to go with pure Ruby you can try this code. Although if you're looking for speed, I'm sure du would be faster.
Have you tried just running du on each directory on demand? On my aging box I can do a du on a 15M directory in ~4ms and on a 250M one in ~50ms. Both seem reasonable for the task at hand. How large are the directories? Before you try to really optimize this, make sure it's really worth your while. YAGNI and all that.
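A sketch of shelling out to du(1) from Ruby, assuming a Unix-like host (the `-s` flag summarizes the tree, `-k` reports kilobytes; the helper name is hypothetical):

```ruby
# Ask du(1) for the total on-disk size of a directory, in kilobytes.
# IO.popen with an argument array avoids shell interpolation of the path.
def directory_size_kb(path)
  output = IO.popen(["du", "-sk", path], &:read)
  output.split("\t").first.to_i
end
```

Because du walks the tree in C and benefits from the kernel's dentry/inode caches, repeated calls on a warm cache are typically much faster than a Ruby-level walk.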
You could always keep track of uploads when they provide you with the file. That way you just need to track the delta as files are added or removed.
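A minimal sketch of that delta-tracking idea, in plain Ruby rather than ActiveRecord (in the real app this counter would presumably live in a database column on the school record; the class and method names here are hypothetical):

```ruby
# Maintain a running byte total per school, adjusted on every upload
# and deletion, so reading the stat is O(1) instead of a tree walk.
class SchoolUsage
  attr_reader :disk_usage_bytes

  def initialize
    @disk_usage_bytes = 0
  end

  def record_upload(size_bytes)
    @disk_usage_bytes += size_bytes
  end

  def record_deletion(size_bytes)
    @disk_usage_bytes -= size_bytes
  end
end
```

The trade-off is that the counter can drift if files are ever changed outside the app, so an occasional reconciliation pass against the real directory sizes would be prudent.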