C# - How to list the files in sub-directories quickly and efficiently

Posted 2024-12-06 21:39:42

I am trying to list the files in all the sub-directories of a root directory with the approach below, but it takes a very long time when the number of files runs into the millions. Is there a better way of doing this?

I am using .NET 3.5, so I can't use the lazy enumerator APIs :-(

        // ******************* Main *************
        DirectoryInfo dir = new DirectoryInfo(path);
        DirectoryInfo[] subDir = dir.GetDirectories();
        foreach (DirectoryInfo di in subDir) // call PopulateList for each sub-directory
        {
             PopulateList(di.FullName, false);
        }

        // Shells out to "dir /s /b" and writes its output to a <sub-directory name>.lst file.
        static void PopulateList(string directory, bool IsRoot)
        {
            System.Diagnostics.ProcessStartInfo procStartInfo =
                new System.Diagnostics.ProcessStartInfo("cmd", "/c dir /s/b \"" + directory + "\"");
            procStartInfo.RedirectStandardOutput = true;
            procStartInfo.UseShellExecute = false;
            procStartInfo.CreateNoWindow = true;
            System.Diagnostics.Process proc = new System.Diagnostics.Process();
            proc.StartInfo = procStartInfo;
            proc.Start();

            // The output file is named after the last path segment of the directory.
            string fileName = directory.Substring(directory.LastIndexOf('\\') + 1);
            StreamWriter writer = new StreamWriter(fileName + ".lst");

            while (!proc.StandardOutput.EndOfStream)
            {
                 writer.WriteLine(proc.StandardOutput.ReadLine());
                 writer.Flush();
            }
            writer.Close();
        }

Comments (5)

水中月 2024-12-13 21:39:42

Remove all Process-related stuff and try out the Directory.GetDirectories() and Directory.GetFiles() methods:

public IEnumerable<string> GetAllFiles(string rootDirectory)
{
    // Files sitting directly in the root directory itself.
    foreach(var file in Directory.GetFiles(rootDirectory))
    {
        yield return file;
    }

    // Every sub-directory at any depth, then the files inside each one.
    foreach(var directory in Directory.GetDirectories(
                                            rootDirectory,
                                            "*",
                                            SearchOption.AllDirectories))
    {
        foreach(var file in Directory.GetFiles(directory))
        {
            yield return file;
        }
    }
}

From MSDN, SearchOption.AllDirectories:

Includes the current directory and all the subdirectories in a search operation. This option includes reparse points like mounted drives and symbolic links in the search.
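
For example, one possible way to consume it for the original task (illustrative only; root stands for the root path from the question, and "files.lst" is just a sample output name):

// Stream the paths into a single list file as they are yielded.
using (StreamWriter writer = new StreamWriter("files.lst"))
{
    foreach (string file in GetAllFiles(root))
    {
        writer.WriteLine(file);
    }
}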

始于初秋 2024-12-13 21:39:42

It will definitely be faster to use DirectoryInfo.GetFiles in a loop for each directory instead of spawning tons of new processes and reading their output.
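
A minimal sketch of that idea (illustrative only; CollectFiles is a hypothetical helper name and error handling is omitted):

using System.Collections.Generic;
using System.IO;

// Recursively collects full file paths, entirely in-process.
static void CollectFiles(DirectoryInfo dir, List<string> results)
{
    foreach (FileInfo file in dir.GetFiles())
        results.Add(file.FullName);       // files in this directory

    foreach (DirectoryInfo sub in dir.GetDirectories())
        CollectFiles(sub, results);       // then descend into each sub-directory
}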

谁与争疯 2024-12-13 21:39:42

With millions of files you're actually running into a filesystem limitation (see this and search for "300,000"), so take that into account.

As for optimizations, I think you'd really want to iterate lazily, so you'll have to P/Invoke into FindFirstFile/FindNextFile.
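
A minimal sketch of that approach, assuming Windows and the usual kernel32 declarations (NativeFileEnumerator is a hypothetical helper name; error handling is kept to a minimum):

using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.InteropServices;

static class NativeFileEnumerator
{
    private static readonly IntPtr INVALID_HANDLE_VALUE = new IntPtr(-1);

    [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Auto)]
    private struct WIN32_FIND_DATA
    {
        public FileAttributes dwFileAttributes;
        public System.Runtime.InteropServices.ComTypes.FILETIME ftCreationTime;
        public System.Runtime.InteropServices.ComTypes.FILETIME ftLastAccessTime;
        public System.Runtime.InteropServices.ComTypes.FILETIME ftLastWriteTime;
        public uint nFileSizeHigh;
        public uint nFileSizeLow;
        public uint dwReserved0;
        public uint dwReserved1;
        [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 260)]
        public string cFileName;
        [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 14)]
        public string cAlternateFileName;
    }

    [DllImport("kernel32.dll", CharSet = CharSet.Auto, SetLastError = true)]
    private static extern IntPtr FindFirstFile(string lpFileName, out WIN32_FIND_DATA lpFindFileData);

    [DllImport("kernel32.dll", CharSet = CharSet.Auto, SetLastError = true)]
    private static extern bool FindNextFile(IntPtr hFindFile, out WIN32_FIND_DATA lpFindFileData);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool FindClose(IntPtr hFindFile);

    // Lazily yields the full path of every file under 'root', descending into sub-directories.
    public static IEnumerable<string> EnumerateFiles(string root)
    {
        Stack<string> pending = new Stack<string>();
        pending.Push(root);

        while (pending.Count > 0)
        {
            string dir = pending.Pop();
            WIN32_FIND_DATA findData;
            IntPtr handle = FindFirstFile(Path.Combine(dir, "*"), out findData);
            if (handle == INVALID_HANDLE_VALUE)
                continue; // inaccessible directory: skip it

            try
            {
                do
                {
                    if (findData.cFileName == "." || findData.cFileName == "..")
                        continue;

                    string fullPath = Path.Combine(dir, findData.cFileName);
                    if ((findData.dwFileAttributes & FileAttributes.Directory) != 0)
                        pending.Push(fullPath);   // a directory: visit it later
                    else
                        yield return fullPath;    // a file: hand it to the caller immediately
                }
                while (FindNextFile(handle, out findData));
            }
            finally
            {
                FindClose(handle);
            }
        }
    }
}

Because the results are streamed one path at a time, the caller can write them to the .lst file as they arrive instead of holding millions of strings in memory.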

雾里花 2024-12-13 21:39:42

Check out the already available Directory.GetFiles overload. For example:

var paths = Directory.GetFiles(root, "*", SearchOption.AllDirectories);

And yes, it will take a lot of time, but I don't think you can improve its performance using only the .NET classes.

拒绝两难 2024-12-13 21:39:42

Assuming that your millions of files are spread across multiple sub-directories and you're using .NET 4.0, you could look at the parallel extensions.

Using a parallel foreach loop to process the list of sub-directories could make things a lot faster.

The new parallel extensions are also a lot safer and easier to use than attempting multi-threading at a lower-level.

The one thing to look out for is making sure that you limit the number of concurrent processes to something sensible.
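
A minimal sketch of that idea, assuming .NET 4.0 and keeping the question's one-.lst-file-per-top-level-sub-directory layout (illustrative only):

using System;
using System.IO;
using System.Threading.Tasks;

class ParallelLister
{
    static void Main(string[] args)
    {
        string root = args[0];

        // Cap the degree of parallelism at something sensible.
        ParallelOptions options = new ParallelOptions
        {
            MaxDegreeOfParallelism = Environment.ProcessorCount
        };

        // One top-level sub-directory per iteration; each iteration writes its own
        // .lst file, so the iterations share no state and need no locking.
        Parallel.ForEach(Directory.GetDirectories(root), options, subDir =>
        {
            string listFile = Path.GetFileName(subDir) + ".lst";
            File.WriteAllLines(listFile,
                Directory.GetFiles(subDir, "*", SearchOption.AllDirectories));
        });
    }
}

Note that adding more threads than the disk can service usually doesn't help much, since this work is I/O-bound rather than CPU-bound.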
