Large number of file handles to large files - potential problems?
Would keeping, say, 512 file handles open to files sized 3 GB+ for the lifetime of a program (say, a week or so) cause issues on 32-bit Linux? On Windows?
Potential workaround: how bad is the performance penalty of opening and closing file handles?
2 Answers
The size of the files doesn't matter. The number of file descriptors does, though. On Mac OS X, for example, the default limit is 256 open files per process, so your program would hit that limit unless it is raised.
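
As a quick illustration of the limit in question, here is a minimal POSIX C sketch using getrlimit/setrlimit to read the per-process descriptor limit and, if possible, raise the soft limit toward the 512 handles mentioned in the question (works on Linux and Mac OS X; the printed messages and the target of 512 are just for illustration):

    /* Query and optionally raise the per-process open-file limit. */
    #include <stdio.h>
    #include <sys/resource.h>

    int main(void)
    {
        struct rlimit rl;

        if (getrlimit(RLIMIT_NOFILE, &rl) != 0) {
            perror("getrlimit");
            return 1;
        }
        /* rlim_cur is the soft limit the process is actually held to;
         * rlim_max is the hard ceiling the soft limit may be raised to. */
        printf("soft limit: %llu, hard limit: %llu\n",
               (unsigned long long)rl.rlim_cur,
               (unsigned long long)rl.rlim_max);

        /* If 512 descriptors are needed and the soft limit is lower,
         * try raising it (cannot exceed the hard limit without privileges). */
        if (rl.rlim_cur < 512 && 512 <= rl.rlim_max) {
            rl.rlim_cur = 512;
            if (setrlimit(RLIMIT_NOFILE, &rl) != 0)
                perror("setrlimit");
            else
                printf("soft limit raised to 512\n");
        }
        return 0;
    }

From a shell, "ulimit -n" reports the same soft limit.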
I don't know about Linux, but in Windows, 512 files doesn't seem that much to me. But as a rule of thumb, any more than a thousand and it's too many. (Although I have to say that I haven't seen any program first-hand opening more than, say, 50.)
And the cost of opening/closing handles isn't that big unless you do it every time you want to read or write a small amount, in which case it's too high and you should buffer your data.
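
To make the buffering point concrete, here is a rough C sketch contrasting the two access patterns; the file names and record counts are made up for illustration, and the actual costs depend on the OS and disk:

    /* Sketch: appending many small records to a file.
     * Reopening the file per record pays an open+close each time;
     * keeping one buffered FILE* open lets stdio coalesce the writes. */
    #include <stdio.h>

    /* Slow pattern: open/close around every small write. */
    static void append_record_reopen(const char *path, const char *record)
    {
        FILE *f = fopen(path, "a");
        if (!f)
            return;
        fputs(record, f);
        fclose(f);            /* flush + close on every record */
    }

    /* Faster pattern: the caller opens the file once and reuses the handle;
     * stdio's buffer batches small writes into larger ones. */
    static void append_record_buffered(FILE *f, const char *record)
    {
        fputs(record, f);     /* usually just a copy into the buffer */
    }

    int main(void)
    {
        /* Slow variant: pays open+close for every record
         * ("reopen.log" is a hypothetical file name). */
        for (int i = 0; i < 1000; i++)
            append_record_reopen("reopen.log", "small record\n");

        /* Fast variant: one open, buffered writes, one close. */
        FILE *log = fopen("buffered.log", "a");
        if (!log)
            return 1;
        for (int i = 0; i < 1000; i++)
            append_record_buffered(log, "small record\n");
        fclose(log);          /* flushes whatever is still buffered */
        return 0;
    }

The buffered variant issues far fewer system calls because stdio coalesces the small writes, which is where the real cost of the open/write/close-per-record pattern comes from.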