Feasibility of running MongoDB on a Linode 512 VPS?

Posted 2024-09-16 00:40:51

I've read the MongoDB documentation, and it comes with a warning about 32-bit systems; in particular, that on 32-bit builds Mongo is capped at roughly 2 GB of data, since the storage engine memory-maps its files.

So I was wondering whether that warning has any bearing on running MongoDB in a low-memory setting such as Linode's 512 MB VPS. I doubt it matters, but I'd go with a 64-bit distro so resources can be added later.

Any warnings, or would someone recommend a minimum amount of RAM to choose for a VPS running MongoDB? I know project requirements figure into this, but I mean a minimum in the sense of what it would be laughable to even spin MongoDB up without.

Comments (1)

许久 2024-09-23 00:40:51

I use it on some small production sites running on my Linode 512, and it's using barely any memory, only about 5-6 MB. My dataset is very small for now.

MongoDB uses a memory-mapped storage engine, which means it relies on the OS page cache to keep frequently used data in memory.

http://www.mongodb.org/display/DOCS/Caching

So unless you have a huge dataset to start out with, a Linode 512 should be OK.
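
If you want a rough sense of how big your dataset actually is, a sketch like the following pymongo check can help; "mydb" and the local connection string are just placeholders for your own setup. The idea is simply to compare data plus index size against whatever RAM the 512 MB box has left over for the OS cache.

```python
from pymongo import MongoClient

# Rough check of dataset size vs. available RAM.
# "mydb" and the connection string are placeholders for your own setup.
client = MongoClient("mongodb://localhost:27017")
stats = client["mydb"].command("dbStats")

data_mb = stats["dataSize"] / 1024 ** 2
index_mb = stats["indexSize"] / 1024 ** 2
print(f"data: {data_mb:.1f} MB, indexes: {index_mb:.1f} MB")
# If data + indexes sit comfortably under what's left of 512 MB after the OS
# and your app, the page cache can keep the hot data in memory.
```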

The one thing I was a little concerned by when researching this is that MongoDB seems to crash when it runs out of memory, without much warning. It's also hard to pin down exactly how much memory or disk space it will use in proportion to how much data you have, and there's no way to specify a hard limit, even at the cost of degraded performance. It's something you'll probably want to monitor.
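
As a starting point for monitoring, here is a minimal sketch that polls mongod's serverStatus numbers over pymongo; it assumes mongod is listening locally on the default port, and the 60-second interval is arbitrary.

```python
import time

from pymongo import MongoClient

# Poll mongod's own view of its memory usage (values are reported in MB).
# Assumes mongod is listening locally on the default port.
client = MongoClient("mongodb://localhost:27017")

while True:
    mem = client.admin.command("serverStatus")["mem"]
    resident = mem.get("resident", 0)  # RAM actually held by the process
    mapped = mem.get("mapped", 0)      # memory-mapped data files, if reported
    print(f"resident: {resident} MB, mapped: {mapped} MB")
    time.sleep(60)  # arbitrary polling interval
```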

You might try running MongoDB with the --smallfiles and --noprealloc options, which start with smaller database files and skip preallocating them, saving disk space if you have a small dataset.
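
For illustration only, a small launcher with those flags might look like the sketch below; the dbpath is just an example, and the flags apply to the memory-mapped storage engine MongoDB uses here. You could equally pass the same flags straight to mongod on the command line.

```python
import subprocess

# Launch mongod with smaller, non-preallocated data files.
# The dbpath is only an example -- point it at your own data directory.
subprocess.run(
    [
        "mongod",
        "--dbpath", "/var/lib/mongodb",  # example data directory
        "--smallfiles",                  # start with smaller database files
        "--noprealloc",                  # skip preallocating data files
    ],
    check=True,
)
```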

Here is one user's experience:

http://groups.google.com/group/mongodb-user/browse_thread/thread/223810a749f0e1eb

Unfortunately that thread was never resolved; it would have been nice if they had found a reason for the crash.

This is also a good read:

http://groups.google.com/group/mongodb-user/browse_thread/thread/2646a52c4f41d832/d43f3ba7bbbbd63d
