Performance in PDO / PHP / MySQL: transaction versus direct execution
I am looping through a number of values (1 to 100 for example) and executing a prepared statement inside the loop.
Is there an advantage to using a transaction - committing after the loop ends - compared to a direct execution inside the loop?
The values are not dependent on each other, so a transaction is not needed from that point of view.
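For illustration, here is a minimal sketch of the two variants being compared. It assumes an existing PDO connection `$pdo` (with exceptions enabled) and a hypothetical `items` table; both are assumptions, not part of the original question.

```php
<?php
// Assumes $pdo is an existing PDO connection with ERRMODE_EXCEPTION set,
// and a hypothetical table: CREATE TABLE items (val INT).
$stmt = $pdo->prepare('INSERT INTO items (val) VALUES (?)');

// Variant 1: direct execution -- on a transactional engine such as InnoDB,
// autocommit makes every execute() its own commit.
for ($i = 1; $i <= 100; $i++) {
    $stmt->execute([$i]);
}

// Variant 2: one transaction around the loop -- a single commit at the end.
$pdo->beginTransaction();
try {
    for ($i = 1; $i <= 100; $i++) {
        $stmt->execute([$i]);
    }
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();
    throw $e;
}
```
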
3 Answers
If your queries are INSERTs, the page 7.2.19. Speed of INSERT Statements of the MySQL manual gives two interesting pieces of information, depending on whether you are using a transactional engine or not:
When using a non-transactional engine:
And, with a transactional engine:
So I'm guessing using transactions might be a good idea -- but I suppose that could depend on the load on your server, whether multiple users are using the same table at the same moment, and all that...
There is more information on the page I linked to, so don't hesitate to read it ;-)
And, if you are doing UPDATE statements:
So I'm guessing the same can be said as for inserts.
BTW: to be sure, you can try both solutions, benchmarking them with microtime on the PHP side, for instance ;-)
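The microtime() benchmark suggested above might be sketched like this, assuming the `$pdo` connection and prepared statement from the question (both assumptions):

```php
<?php
// Benchmark sketch: time both variants with microtime(true).
// Assumes $pdo and a prepared $stmt as in the question's loop.
$start = microtime(true);
for ($i = 1; $i <= 100; $i++) {
    $stmt->execute([$i]);           // autocommit: one commit per row
}
$direct = microtime(true) - $start;

$start = microtime(true);
$pdo->beginTransaction();
for ($i = 1; $i <= 100; $i++) {
    $stmt->execute([$i]);
}
$pdo->commit();                     // one commit for all rows
$withTx = microtime(true) - $start;

printf("direct: %.4fs, transaction: %.4fs\n", $direct, $withTx);
```
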
For a faster time you could do all the inserts in one shot, or group them together, perhaps 5 or 10 at a time; note that if one insert fails, the entire batch will fail.
http://www.desilva.biz/mysql/insert.html
A transaction will slow you down, so if you don't need it then don't use it.
A prepared statement would be a good choice though even if you did batch inserts, as you don't have to keep building up the query each time.
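Grouping rows into multi-row INSERT statements, as suggested above, might look like the following sketch; the `items` table and the batch size of 10 are assumptions:

```php
<?php
// Sketch: insert 100 values in batches of 10 using multi-row INSERT syntax.
// If one row in a batch is invalid, the whole batch fails together.
$values = range(1, 100);
foreach (array_chunk($values, 10) as $batch) {
    // Build "INSERT INTO items (val) VALUES (?),(?),...,(?)"
    $placeholders = implode(',', array_fill(0, count($batch), '(?)'));
    $stmt = $pdo->prepare("INSERT INTO items (val) VALUES $placeholders");
    $stmt->execute($batch);
}
```

Note that a statement prepared for one batch size cannot be reused for a differently sized final batch, which is why the prepare is inside the loop here.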
I faced the same question when I had to implement a CSV file (possibly quite long) data import (I know you can use the LOAD DATA INFILE syntax for that but I had to apply some processing on my fields before insertion).
So I made an experiment with transactions and a file with about 15k rows. The result: if I insert all records inside one single transaction, it takes only a few seconds and it's CPU-bound. If I don't use any transaction at all, it takes several minutes and it's IO-bound.
By committing every N rows, I got intermediate results.
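The commit-every-N-rows approach described above could be sketched like this; the file name, table, and batch size are all assumptions:

```php
<?php
// Sketch: CSV import committing every 1000 rows.
// Assumes a file import.csv and a table matching its first column.
$batchSize = 1000;
$stmt = $pdo->prepare('INSERT INTO items (val) VALUES (?)');

$fh = fopen('import.csv', 'r');
$pdo->beginTransaction();
$n = 0;
while (($row = fgetcsv($fh)) !== false) {
    // ...apply any per-field processing here before insertion...
    $stmt->execute([$row[0]]);
    if (++$n % $batchSize === 0) {
        $pdo->commit();             // flush this batch to disk
        $pdo->beginTransaction();   // start the next batch
    }
}
$pdo->commit();                     // commit the final partial batch
fclose($fh);
```
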