500 error in PHP after many calls to mysql_query
I have a PHP script that steps through a folder containing tab-delimited files, parsing them line by line and inserting the data into a MySQL database. I cannot use LOAD DATA INFILE because of security restrictions on my server, and I do not have access to the configuration files. The script works just fine parsing one or two smaller files, but when working with several large files I get a 500 error. There do not appear to be any error logs containing messages pertaining to the error, at least none that my hosting provider gives me access to. The code is below; I am also open to suggestions for alternate ways of doing what I need to do. Ultimately I want this script to fire off every 30 minutes or so, inserting new data and deleting the files when finished.
EDIT: After making the changes Phil suggested, the script still fails, but I now have the following message in my error log: "mod_fcgid: read data timeout in 120 seconds". It looks like the script is timing out. Any idea where I can change the timeout setting?
$folder = opendir($dir);
while (($file = readdir($folder)) !== false) {
    $filepath = $dir . "/" . $file;
    // If it is a file and ends in txt, parse it and insert the records into the db
    if (is_file($filepath) && substr($filepath, -3) == "txt") {
        uploadDataToDB($filepath, $connection);
    }
}
closedir($folder);

function uploadDataToDB($filepath, $connection) {
    ini_set('display_errors', 'On');
    error_reporting(E_ALL);
    ini_set('max_execution_time', 300);

    $insertString = "INSERT INTO dirty_products VALUES (";
    $count = 0;
    $file = fopen($filepath, "r");
    if ($file === false) {
        return; // bail out instead of silently looping on a bad handle
    }
    while (($line = fgets($file)) !== false) {
        // Strip the trailing newline so it doesn't end up in the last column
        $line = rtrim($line, "\r\n");
        $values = "";
        $valueArray = explode("\t", $line);
        foreach ($valueArray as $value) {
            // Escape quotes, backslashes etc. properly for MySQL
            $value = mysql_real_escape_string($value, $connection);
            if ($values != "")
                $values .= ",'" . $value . "'";
            else
                $values = "'" . $value . "'";
        }
        mysql_query($insertString . $values . ")", $connection);
        $count++;
    }
    fclose($file);
    echo "<p>Count: " . $count . "</p>";
}
Comments (1)
First thing I'd do is use prepared statements (using PDO).
Using the mysql_query() function, you're creating a new statement for every insert and you may be exceeding the allowed limit. If you use a prepared statement, only one statement is created and compiled on the database server.
Example
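A minimal sketch of the insert loop rewritten with a PDO prepared statement. It assumes an existing PDO connection in $pdo; the three placeholders are illustrative, so match them to the column count of dirty_products.

// Prepare the statement once; the server compiles it a single time
// and each execute() only ships the row values.
// $pdo is assumed to be an existing PDO connection, e.g.
// new PDO('mysql:host=localhost;dbname=mydb', $user, $pass);
$stmt = $pdo->prepare("INSERT INTO dirty_products VALUES (?, ?, ?)");

$file = fopen($filepath, "r");
while (($line = fgets($file)) !== false) {
    $row = explode("\t", rtrim($line, "\r\n"));
    $stmt->execute($row); // PDO quotes and escapes the values for us
}
fclose($file);

Wrapping each file in a transaction ($pdo->beginTransaction() before the loop, $pdo->commit() after) will also speed the inserts up considerably.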
Considering you want to run this script on a schedule, I'd avoid the web server entirely and run the script via the CLI using cron or whatever scheduling service your host provides. This will help you avoid any timeout configured in the web server.
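For the 30-minute schedule, a crontab entry along these lines would do it (the PHP binary and script paths here are placeholders; substitute your own):

*/30 * * * * /usr/bin/php /path/to/import.php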