MySQL batch insert - large number of records
I am trying to insert multiple records into a MySQL table in a single go.
FYI: my tech stack is Node.js + MySQL (https://www.npmjs.com/package/mysql).
What I am currently doing is:
INSERT INTO table_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
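For reference, such a statement can be issued from Node via the mysql package's nested-array placeholder. This is a minimal sketch, not the asker's actual code; the connection settings and the rows array are placeholders:

// Minimal sketch of the single-statement approach with the mysql package.
// Connection settings and the rows array are placeholders.
const mysql = require('mysql');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'user',
  password: 'secret',
  database: 'mydb',
});

// In the scenario described, this would hold ~50k entries.
const rows = [[1, 2, 3], [4, 5, 6], [7, 8, 9]];

// The nested-array placeholder expands to VALUES (1,2,3),(4,5,6),...,
// so the entire batch travels to the server as one statement.
connection.query('INSERT INTO table_name (a, b, c) VALUES ?', [rows], (err) => {
  if (err) throw err;
  connection.end();
});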
The problem with the above approach is that with around 50k records, the single statement grows so large that it overflows MySQL's packet buffer.
To get around that, I raised
max_allowed_packet = 1000MB
but I am sure that is not a proper solution. I am looking for the best and most optimized approach for this scenario.
Any clues or hints would be highly appreciated. Thanks
1 Answer
For many reasons (you encountered one of them), I break up batch inserts into clumps of 100 or 1000 rows at a time. That will be within 1% of the same speed, while avoiding lots of problems.
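A minimal sketch of that chunking, assuming the mysql package and the table from the question; the 1000-row clump size is the one suggested above:

// Send the rows in clumps of 1000 instead of one giant statement,
// so each query stays well below max_allowed_packet.
const CHUNK_SIZE = 1000;

function insertInChunks(connection, rows, done) {
  if (rows.length === 0) return done(null);
  const chunk = rows.slice(0, CHUNK_SIZE);
  // Nested arrays expand to (v1,v2,v3),(v4,v5,v6),... as in the question.
  connection.query('INSERT INTO table_name (a, b, c) VALUES ?', [chunk], (err) => {
    if (err) return done(err);
    insertInChunks(connection, rows.slice(CHUNK_SIZE), done); // next clump
  });
}

// Usage: insertInChunks(connection, rows, (err) => { if (err) throw err; });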