MySQL update on a large table
I want to update a table with 90 million records.
For example:
UPDATE huge_table set field3 = CONCAT_WS(' ', field1, field2)
The update locks the table for a while and I want to minimize the amount of time locked.
Should I populate a temp table to store the values, then assign them from the temp table? Or should I try to update in batches of 1000? Perhaps a combination of both.
3 Answers
I would run the update in batches.
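A minimal sketch of that approach, assuming `field3` starts out NULL so unprocessed rows can be identified (an assumption not stated in the question). Each statement touches at most 10,000 rows, so locks are held only briefly per batch:

```sql
-- One batch: populate up to 10,000 not-yet-updated rows.
-- Repeat (committing between batches) until ROW_COUNT() returns 0.
UPDATE huge_table
   SET field3 = CONCAT_WS(' ', field1, field2)
 WHERE field3 IS NULL
 LIMIT 10000;
```

The driving loop would live in application code or a stored procedure; if `field3` is not NULL beforehand, a different marker (for example a primary-key range) is needed to tell processed rows from unprocessed ones.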
You should do a load like that in a different environment and then re-point the table to the fresh data set.
Neither. You need the update to be as fast as possible. If you create a temporary table, then you will be adding 90 million inserts and deletes.
Consider table partitioning. See http://dev.mysql.com/doc/refman/5.5/en/partitioning-overview.html.
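As a rough illustration of that idea (hypothetical boundaries on an assumed auto-increment primary key `id`; note that MySQL requires the partitioning column to be part of every unique key on the table):

```sql
-- Hypothetical range partitioning on the primary key `id`.
-- Each partition holds roughly 30 million rows and can be
-- rebuilt or updated independently of the others.
ALTER TABLE huge_table
PARTITION BY RANGE (id) (
    PARTITION p0 VALUES LESS THAN (30000000),
    PARTITION p1 VALUES LESS THAN (60000000),
    PARTITION p2 VALUES LESS THAN MAXVALUE
);
```

Partitioning does not remove the need to touch all 90 million rows, but it bounds how much of the table any one statement locks at a time.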