PDO::fetchAll vs. PDO::fetch in a loop
Just a quick question.
Is there any performance difference between using PDO::fetchAll() and PDO::fetch() in a loop (for large result sets)?
I'm fetching into objects of a user-defined class, if that makes any difference.
My initial uneducated assumption was that fetchAll might be faster because PDO can perform multiple operations in one statement while mysql_query can only execute one. However, I have little knowledge of PDO's inner workings, and the documentation doesn't say anything about this, nor whether fetchAll() is simply a PHP-side loop that dumps the rows into an array.
Any help?
Comments (7)
Little benchmark with 200k random records. As expected, the fetchAll method is faster but requires more memory.
The benchmark code used:
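The code block itself isn't reproduced above, so here is only a rough sketch of what such a benchmark could look like; the DSN, credentials and the benchmark_records table are placeholders, not the original code, and it fetches plain associative arrays for simplicity.

<?php
// Rough sketch (not the original benchmark): time and memory for fetchAll()
// versus fetch() in a while loop over a large result set.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass'); // placeholder DSN
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$sql = 'SELECT id, title, body FROM benchmark_records'; // placeholder table, ~200k rows assumed

// fetchAll(): one call pulls every row into a PHP array
$memBefore = memory_get_usage();
$start     = microtime(true);
$stmt = $pdo->query($sql);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
printf("fetchAll : %.4f s, %.2f MB\n",
    microtime(true) - $start,
    (memory_get_usage() - $memBefore) / 1048576);
unset($rows, $stmt);

// fetch() in a loop: rows are handled one at a time and nothing is accumulated
$memBefore = memory_get_usage();
$start     = microtime(true);
$stmt = $pdo->query($sql);
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // process $row here
}
printf("fetch    : %.4f s, %.2f MB\n",
    microtime(true) - $start,
    (memory_get_usage() - $memBefore) / 1048576);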
One thing about PHP that I've found to be true almost always is that a function you implement yourself will almost always be slower than the PHP equivalent. This is because anything implemented in PHP doesn't get the compile-time optimizations that C (which PHP is written in) has, and PHP function calls carry significant overhead.
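As a concrete illustration of that point (not code from the thread): a hand-rolled replacement for fetchAll() has to make one PHP-level fetch() call per row, whereas the real fetchAll() runs that loop in C inside the driver.

<?php
// Illustration only: roughly what fetchAll() does, but written in userland
// PHP, so it pays the cost of a PHP method call for every single row.
function fetchAllManually(PDOStatement $stmt): array
{
    $rows = [];
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $rows[] = $row;
    }
    return $rows;
}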
All the benchmarks above that measure "memory footprint" are actually incorrect, for a very simple reason.
By default, PDO loads everything into memory, and it doesn't care whether you use fetch or fetchAll.
To really get the benefit of an unbuffered query, you should instruct PDO to use unbuffered queries:
$db->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
In that case you will see a huge difference in the script's memory footprint.
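A minimal sketch of what that looks like in practice (connection details and table name are invented): with buffering disabled, fetch() streams rows from the server instead of working off a client-side copy of the whole result set.

<?php
$db = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass'); // placeholder DSN
$db->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);      // unbuffered mode

$stmt = $db->query('SELECT id, payload FROM big_table'); // placeholder table
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // each row is read from the server as you iterate; only the current
    // row (plus whatever you keep yourself) lives in PHP memory
}
$stmt->closeCursor(); // release the result set before issuing another query on this connection

printf("peak memory: %.2f MB\n", memory_get_peak_usage(true) / 1048576);

The trade-off is that the connection stays busy until the result set has been fully read or closed, and calling fetchAll() on an unbuffered query still pulls everything into PHP memory anyway.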
@Arkh
The memory difference should become negligible
As Mihai Stancu was saying, there is almost no memory difference, though fetchAll beats fetch + while.
I got the results above by running the while loop correctly:
So fetchAll consumes less memory, but fetch + while is faster! :)
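The numbers and code aren't shown here, but "running the while loop correctly" presumably means keeping every row, the same way fetchAll() does, so that both variants end up holding the full result set; something along these lines (placeholder table name):

$stmt = $pdo->query('SELECT * FROM benchmark_records'); // placeholder table
$rows = [];
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $rows[] = $row; // keep the row, just like fetchAll() would
}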
But surely if you're storing the fetched data in an array, the memory usage will be equal?
I know this is an old topic, but I ran across it while having the same question. After running my own simple "benchmark" and reading what others wrote here, I came to the conclusion that this is not an exact science: while one should strive to write quality, lightweight code, there is no point wasting too much time on it at the start of a project.
My suggestion is: gather data by running the code (in beta?) for a while, and then start optimizing.
In my simple benchmark (which only tested execution time) I got results varying between 5% and 50% BOTH ways. I ran both options in the same script, but when I ran fetch + while first it was faster than fetchAll, and vice versa. (I know I should have run them separately, a couple hundred times each, taken the median and mean, and then compared, but - as I said at the beginning - I concluded that in my case it is too early to start doing so.)
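For what it's worth, the methodology described but skipped there would look something like this (illustrative helper, not code from the thread): run each variant on its own a few hundred times and compare the median and mean rather than a single run.

<?php
// Illustrative timing harness: repeat one variant many times and report
// the median and mean of the samples.
function timeVariant(callable $variant, int $iterations = 200): array
{
    $samples = [];
    for ($i = 0; $i < $iterations; $i++) {
        $start = microtime(true);
        $variant();
        $samples[] = microtime(true) - $start;
    }
    sort($samples);
    return [
        'median' => $samples[intdiv(count($samples), 2)],
        'mean'   => array_sum($samples) / count($samples),
    ];
}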