Browser crashes when about 4 million records are inserted into MySQL

Posted 2024-08-03 11:19:39


I downloaded a database that was exported to TXT format; it is about 700 MB and contains 7 million records (one per line). I wrote a script to import the data into a MySQL database, but after about 4 million records have been inserted, the browser crashes. I have tested it in Firefox and IE. Can someone give me an opinion and some advice about this?

The script is this:

<?php
set_time_limit(0);                  // no execution time limit for the long import
ini_set('memory_limit','128M');

// Connect and select the target database (old mysql_* API).
$conexao = mysql_connect("localhost","root","") or die (mysql_error());
$base = mysql_select_db("lista102",$conexao) or die (mysql_error());
$ponteiro = fopen("TELEFONES_1.txt","r");  // the ~700 MB dump, one record per line
$conta = 0;
function myflush(){ ob_flush(); flush(); } // push buffered output to the browser

while(!feof($ponteiro)){
    $conta++;

    // Read one line, strip the double quotes, split on ";".
    $linha = fgets($ponteiro,4096);
    $linha = str_replace("\"", "", $linha);
    $arr = explode(";",$linha);

    // One INSERT per line of the file.
    $sql = "insert into usuarios (CPF_CNPJ,NOME,LOG,END,NUM,COMPL,BAIR,MUN,CEP,DDD,TELEF) values ('".$arr[0]."','".$arr[1]."','".$arr[2]."','".$arr[3]."','".$arr[4]."','".$arr[5]."','".$arr[6]."','".$arr[7]."','".$arr[8]."','".$arr[9]."','".trim($arr[10])."')";
    $rs = mysql_query($sql);
    if(!$rs){ echo $conta ." error";}

    // Every 5000 lines: pause 10 s and report progress to the browser.
    if(($conta%5000)==4999) { sleep(10); echo "<br>Pause: ".$conta; }
    myflush();
}

echo "<BR>Eof, import complete";
fclose($ponteiro);
mysql_close($conexao);
?>


Comments (7)

紫竹語嫣☆ 2024-08-10 11:19:39


Try splitting the file into 100 MB chunks. This is a quick fix to get the job done; the browser issue itself can be complicated to solve. Also try different browsers.

phpMyAdmin has an option to continue a query after a crash: it can interrupt an import when the script detects it is close to the time limit. This might be a good way to import large files, though it can break transactions.
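For illustration, a minimal sketch of the splitting step in PHP, cutting only on line boundaries so no record is broken in half (the 100 MB threshold and part-file names are assumptions, not from the original answer):

<?php
// Split TELEFONES_1.txt into pieces of roughly 100 MB each.
$in = fopen("TELEFONES_1.txt", "r") or die("cannot open input");
$limit = 100 * 1024 * 1024;        // target size per piece, in bytes
$part = 0;
$written = $limit;                 // forces the first piece to be opened
$out = null;
while (($linha = fgets($in)) !== false) {
    if ($written >= $limit) {      // current piece is full: start the next one
        if ($out) fclose($out);
        $out = fopen(sprintf("TELEFONES_1.part%03d.txt", ++$part), "w");
        $written = 0;
    }
    $written += fwrite($out, $linha);
}
if ($out) fclose($out);
fclose($in);
?>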

迟到的我 2024-08-10 11:19:39


I'm not sure why you need a web browser to insert records into MySQL. Why not just use the import facilities of the database itself and leave the web out of it?

If that's not possible, I wonder whether chunking the inserts into groups of 1000 at a time would help. Rather than committing the entire database as a single transaction, I'd recommend breaking it up.

Are you using InnoDB?
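For illustration, a sketch of the "let the database import it" route using MySQL's LOAD DATA, assuming the connection from the question's script, that LOCAL INFILE is enabled on both server and client, and that the dump really is semicolon-separated with double-quoted values:

<?php
// MySQL reads and parses the file itself instead of PHP looping row by row.
$sql = "LOAD DATA LOCAL INFILE 'TELEFONES_1.txt'
        INTO TABLE usuarios
        FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        (CPF_CNPJ,NOME,LOG,END,NUM,COMPL,BAIR,MUN,CEP,DDD,TELEF)";
mysql_query($sql, $conexao) or die(mysql_error());
?>

If you stay with PHP-side inserts, batching 1000 rows into a single multi-row INSERT ... VALUES (...),(...),... statement gives much of the same benefit.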

未央 2024-08-10 11:19:39

  1. The first thing I noticed is that you are using flush() unsafely. Calling flush() when the httpd buffer is full results in an error, and your script dies. Drop the whole myflush() workaround and use a single ob_implicit_flush() call instead.

  2. You don't need to watch it in your browser for it to run to the end; add an ignore_user_abort() call so your code completes its job even if your browser dies (see the sketch after this list).

  3. Not sure why your browser is dying. Maybe your script is generating too much content.
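For illustration, a minimal sketch of both calls at the top of the import script; everything else stays as in the question:

<?php
ignore_user_abort(true);  // keep running even if the browser disconnects
ob_implicit_flush(true);  // flush after every piece of output, no manual flush()
set_time_limit(0);
// ... the import loop from the question goes here ...
?>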

乙白 2024-08-10 11:19:39


Try it with no

<br> Pause: nnnn

output to the browser, and see if that helps. It may simply be that the browser is choking on the long web page it is being asked to render.

Also, is PHP timing out during the long transfer?

It also doesn't help that the sleep(10) calls add to the time it takes.

铁憨憨 2024-08-10 11:19:39


You can try splitting the file up into several TXT files and redoing the process with those files. I know I've used that approach at least once.

无敌元气妹 2024-08-10 11:19:39


The browser is choking because the request is taking too long to complete. Is there a reason this process needs to be part of a web page? If you absolutely have to do it this way, consider splitting your data into manageable chunks.

困倦 2024-08-10 11:19:39


Run your code from the command line using PHP-CLI. That way you will never hit a time-out on a long-running process (although in your case the browser crashes before any time-out ^^).
If you have to execute it on a hosting server without shell access, run the code via crontab, but you have to make sure the crontab runs it only once!
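For illustration, a minimal run-once guard for the cron route; the marker-file path and script name are hypothetical:

<?php
// Later cron ticks exit immediately once the marker file exists.
// Example crontab entry (path is illustrative):
//   */5 * * * * php /path/to/import.php
$marker = "/tmp/telefones_imported.done";
if (file_exists($marker)) {
    exit("Import already done\n");
}
// ... run the full import loop from the question here ...
touch($marker);  // remember that the import finished successfully
?>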
