Does performance degrade when opening a file in a directory containing a huge number of files?
Suppose we want to open a file in a directory, but the directory contains a huge number of files. When I ask a program to open a file there, how fast can it find that particular file? Will there be a performance drop when looking up the requested file in this case?
PS. This should also depend on the file system's implementation, yes?
Comments (1)
Yes, it depends a lot on the file system implementation.
Some file systems have specific optimizations for large directories. One example I can think of is ext3, which uses HTree indexing for large directories.
Generally speaking, there will usually be some delay in finding the file. Once the file is located/opened, however, reading it should not be slower than reading any other file.
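As a rough illustration (a quick sketch, not a rigorous benchmark), you can time opening a file by name in a nearly empty directory versus one with thousands of sibling entries. On filesystems with directory indexing (e.g. ext4's HTree) the gap may be negligible; on older or unindexed filesystems the lookup cost grows with the entry count:

```python
import os
import tempfile
import time

def time_open(directory, name, repeats=100):
    """Time repeated open-by-name lookups of one file in a directory."""
    path = os.path.join(directory, name)
    start = time.perf_counter()
    for _ in range(repeats):
        with open(path, "rb"):
            pass  # we only care about the name lookup, not reading
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as small, tempfile.TemporaryDirectory() as big:
    # One file in the small directory, 10,000 siblings in the big one.
    open(os.path.join(small, "target"), "w").close()
    for i in range(10_000):
        open(os.path.join(big, f"file{i:05d}"), "w").close()
    open(os.path.join(big, "target"), "w").close()

    print(f"small dir: {time_open(small, 'target'):.4f}s")
    print(f"big dir:   {time_open(big, 'target'):.4f}s")
```

Note that once a directory's entries are cached in memory (the kernel's dentry cache), repeated lookups will be fast regardless, so results vary a lot between runs.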
Some programs that need to handle a large number of files (for caching, for example) put them in a large directory tree, to reduce the number of entries per directory.
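That last technique is commonly done by sharding on a hash prefix, so each directory holds only a small slice of the files. The helper below is a hypothetical sketch (the names `shard_path` and `store` are made up for illustration); it is similar in spirit to how Git stores loose objects under `.git/objects/ab/...`:

```python
import hashlib
import os

def shard_path(root, key, levels=2, width=2):
    """Map a key to a nested path using its hash prefix, so that
    each directory level fans out into at most 16**width entries."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    parts = [digest[i * width:(i + 1) * width] for i in range(levels)]
    return os.path.join(root, *parts, digest[levels * width:])

def store(root, key, data):
    """Write data for a key into the sharded tree, creating dirs as needed."""
    path = shard_path(root, key)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return path

# Yields paths like "cache/ab/cd/<remaining-hash>" instead of
# putting every entry into one flat "cache/" directory.
print(shard_path("cache", "some-cache-key"))
```

With two levels of two hex characters each, a million cached files spread over 65,536 leaf directories, keeping each one small.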