PHP: scandir() is too slow

Posted on 2024-10-20 00:37:15


I have to write a function that lists all the subfolders inside a folder. I filter out regular files, but the function uses scandir() for the listing, and that makes the application very slow. Is there an alternative to scandir(), even a non-native PHP function?
Thanks in advance!


Comments (2)

千寻… 2024-10-27 00:37:15


You can use readdir, which may be faster. Something like this:

function readDirectory($Directory, $Recursive = true)
{
    if(is_dir($Directory) === false)
    {
        return false;
    }

    $Resource = opendir($Directory);

    if($Resource === false)
    {
        return false;
    }

    // Make sure the path ends with a separator so entries can be appended to it.
    $Directory = rtrim($Directory, DIRECTORY_SEPARATOR) . DIRECTORY_SEPARATOR;
    $Found = array();

    while(false !== ($Item = readdir($Resource)))
    {
        // Skip the current and parent directory entries.
        if($Item == "." || $Item == "..")
        {
            continue;
        }

        $Path = $Directory . $Item;

        if($Recursive === true && is_dir($Path))
        {
            // Recurse into the subdirectory and merge its results.
            $Found = array_merge($Found, readDirectory($Path, $Recursive));
        }
        else
        {
            $Found[] = $Path;
        }
    }

    closedir($Resource);

    return $Found;
}

This may require some tweaking, but it is essentially what scandir does, and it should be faster. If it isn't, please write an update, as I would like to see whether I can come up with a faster solution.

Another issue is that if you're reading a very large directory, you're filling up an array in memory, and that may be where your memory is going.

You could try creating a function that reads at an offset, so that it returns 50 files at a time!

Reading chunks of files at a time would be just as simple to use; it would look like this:

$offset = 0;
while(false !== ($Batch = ReadFilesByOffset("/tmp", $offset)))
{
    // Use $Batch here; it contains 50 or fewer files.

    // Increment the offset:
    $offset += 50;
}
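
ReadFilesByOffset is not defined in the answer; here is a minimal sketch of how it could be built on top of the readDirectory() helper above. The batch size of 50 and the "return false once the offset runs past the end" convention are assumptions made to match the loop:

// Hypothetical helper: returns up to $Limit entries starting at $Offset,
// or false once the offset runs past the end of the listing.
function ReadFilesByOffset($Directory, $Offset, $Limit = 50)
{
    // A real implementation would cache this listing between calls
    // instead of rescanning the directory every time.
    $All = readDirectory($Directory);

    if($All === false || $Offset >= count($All))
    {
        return false;
    }

    return array_slice($All, $Offset, $Limit);
}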
悲喜皆因你 2024-10-27 00:37:15


Don't write your own. PHP has a RecursiveDirectoryIterator built specifically for this:

http://php.net/manual/en/class.recursivedirectoryiterator.php

As a rule of thumb (i.e., not 100% of the time), since the iterator is implemented in straight C, anything you build in PHP is going to be slower.
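
For reference, a minimal sketch of how the iterator could be used to collect only subdirectories. The path '/path/to/folder' is a placeholder; SKIP_DOTS and SELF_FIRST are standard SPL options:

// List every subdirectory under a folder using the SPL iterators.
$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/path/to/folder', FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);

$folders = array();

foreach($iterator as $item)
{
    // Each $item is an SplFileInfo object; keep only directories.
    if($item->isDir())
    {
        $folders[] = $item->getPathname();
    }
}

print_r($folders);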
