How to import a large SQL file into a MySQL table
I have a PHP script that parses XML files and creates a large SQL file that looks something like this:
INSERT IGNORE INTO table(field1,field2,field3...)
VALUES ("value1","value2",int1...),
("value1","value2",int1)...etc
This file adds up to over 20GB (I've tested with a 2.5GB file, and it fails too).
I've tried commands like:
mysql -u root -p table_name < /var/www/bigfile.sql
This works on smaller files, say around 50MB, but it doesn't work with larger files.
I tried:
mysql> source /var/www/bigfile.sql
I also tried mysqlimport but that won't even properly process my file.
I keep getting an error that says:
ERROR 2006 (HY000): MySQL server has gone away
It happens approx. 30 seconds after I start executing.
I set max_allowed_packet to 4GB, but when verifying it with SHOW VARIABLES it only shows 1GB.
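For reference, MySQL hard-caps max_allowed_packet at 1GB (1073741824 bytes) and silently clamps anything larger, which matches what SHOW VARIABLES reports here. A minimal sequence to confirm and raise it to that ceiling:

-- Server side: raise to the 1GB ceiling; applies to new connections.
SET GLOBAL max_allowed_packet = 1073741824;
SHOW VARIABLES LIKE 'max_allowed_packet';

The client keeps its own copy of the setting, so it may need raising too, e.g. mysql --max_allowed_packet=1073741824 -u root -p db_name < /var/www/bigfile.sql (db_name being a placeholder for the actual database).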
Is there a way to do this without wasting another 10 hours?
Comments (1)
Try splitting the file into multiple smaller INSERT statements, so that no single statement exceeds max_allowed_packet.
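A minimal PHP sketch of that approach, assuming the layout shown in the question (the INSERT IGNORE header on the first line, then one value tuple per line); the file paths, the chunks/ output directory, and the 10,000-row chunk size are placeholders to adjust:

<?php
// Split bigfile.sql into many smaller INSERT statements so that no single
// statement exceeds max_allowed_packet. Assumes one "(...)," tuple per line
// after the header, as in the snippet above; no error handling for brevity.
$rowsPerChunk = 10000;                       // tune so each chunk stays well under the packet limit
$in     = fopen('/var/www/bigfile.sql', 'r');
$header = rtrim(fgets($in));                 // "INSERT IGNORE INTO table(field1,field2,field3...)"
$out    = null;
$chunk  = 0;
$row    = 0;
while (($line = fgets($in)) !== false) {
    // Normalize each line to a bare "(...)" tuple: drop a leading VALUES
    // keyword and any trailing comma or semicolon.
    $tuple = rtrim(preg_replace('/^\s*VALUES\s+/i', '', rtrim($line)), ',;');
    if ($tuple === '') {
        continue;
    }
    if ($row % $rowsPerChunk === 0) {        // start a new chunk file
        if ($out !== null) {
            fwrite($out, ";\n");
            fclose($out);
        }
        $out = fopen(sprintf('/var/www/chunks/chunk_%05d.sql', $chunk++), 'w');
        fwrite($out, $header . "\nVALUES\n" . $tuple);
    } else {
        fwrite($out, ",\n" . $tuple);
    }
    $row++;
}
if ($out !== null) {
    fwrite($out, ";\n");                     // terminate the final statement
    fclose($out);
}
fclose($in);

The chunks can then be loaded in a shell loop, e.g. for f in /var/www/chunks/chunk_*.sql; do mysql -u root -p db_name < "$f"; done (again with db_name as a placeholder). A side benefit is that a failure partway through only costs one chunk, not the whole 20GB run.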