CSV import timeout problem

Posted 2024-12-15 22:16:32


I'm trying to modify a CSV import function that times out after 60 seconds of importing. For every row, images are resized and some other code is executed.

I know the VPS can handle the work, but only in batches: I have another website on the same server that runs a different CSV program doing the same thing, and it can import 8,000 rows and resize images as well. Its settings are: process 10 rows, wait 3 seconds, repeat.

Settings I have raised:

  • set_time_limit
  • max_execution_time
  • browser HTTP keep-alive timeout

I have tried calling sleep() on every 10th line, but this only makes the process import fewer lines:

if( (($current_line % 10) == 0) && ($current_line != 0) )
{
  sleep(3);
}

This is how the script loops through the file:

for ($current_line = 0; $line = fgetcsv($handle, MAX_LINE_SIZE, Tools::getValue('separator')); $current_line++)
{
//code here
}
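One way to reproduce the batch behavior of the other importer is to spread the work across multiple short requests, each handling a fixed number of rows and then handing off to the next request. A minimal sketch, assuming a hypothetical import.php entry point and a process_line() helper (both names invented for illustration), with a hard-coded separator standing in for Tools::getValue('separator'):

```php
<?php
// Batching sketch (hypothetical names: import.php, process_line()).
// Each request handles BATCH_SIZE rows starting at ?offset=N, then
// redirects to the next batch, so no single request has to finish
// the whole file within the 60-second limit.
define('BATCH_SIZE', 10);

$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$handle = fopen('import.csv', 'r');

// Skip the rows that earlier requests already processed.
for ($i = 0; $i < $offset && fgetcsv($handle, 4096, ';'); $i++);

$processed = 0;
while ($processed < BATCH_SIZE && ($line = fgetcsv($handle, 4096, ';'))) {
    process_line($line);   // resize images, insert rows, etc.
    $processed++;
}
fclose($handle);

if ($processed === BATCH_SIZE) {
    // More rows may remain: ask the browser to fetch the next batch.
    header('Location: import.php?offset=' . ($offset + BATCH_SIZE));
} else {
    echo 'Import finished: ' . ($offset + $processed) . ' rows';
}
```

Because every request finishes well under 60 seconds, neither PHP's max_execution_time nor the browser/Varnish timeouts are ever hit, which is presumably why the other site's 10-rows-per-batch importer works.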

Server:

  • Apache
  • PHP 5.3.3
  • MySQL
  • Varnish cache

What can I do to make this work?


3 Answers

星軌x 2024-12-22 22:16:32


The first thing to try when your script times out is to run it with the PHP CLI. Scripts run through the command line have no execution time limit by default.

If that doesn't solve your problem, then you know it wasn't the execution time limit.

The second thing to try is to print regular status messages, including the value of memory_get_usage(), so that you can rule out a memory leak as the cause of the crash. This may also help you identify whether the script is dying on some particular input.
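A sketch of the kind of status output this answer suggests, printed every 100 rows inside the import loop (the loop body here is a placeholder for the real import code, and the file name and separator are assumptions):

```php
<?php
// Print a status line every 100 rows so you can watch progress and
// memory growth; a steady climb in memory_get_usage() points at a leak.
set_time_limit(0);   // ignored on the CLI, harmless under Apache

$handle = fopen('import.csv', 'r');
for ($current_line = 0; $line = fgetcsv($handle, 4096, ';'); $current_line++) {
    // ... per-row import work goes here ...

    if ($current_line % 100 === 0) {
        printf("row %d, memory %.1f MB\n",
               $current_line,
               memory_get_usage() / 1048576);
    }
}
fclose($handle);
```

Run it as php import.php from a shell; if memory stays flat and the script still dies at a consistent row, the problem is likely in the input data rather than in a limit.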

峩卟喜欢 2024-12-22 22:16:32


You can override the default timeout:

set_time_limit(0);

Using sleep() will make it import fewer lines: the script is timing out because it takes over 60 seconds, and by adding sleep it just gets less done within those 60 seconds.

If this is a critical script I'd look at moving it to another programming language that can execute it faster. If it's just a one-off, or not mission-critical, try set_time_limit(0), which makes it never time out. Also try running php scriptname from the command line rather than through the browser.

人疚 2024-12-22 22:16:32


Try outputting something to the browser to keep the connection alive. IE times out after about 1 minute of inactivity; Firefox after about 3 minutes.

<?php
if( (($current_line % 10) == 0) && ($current_line != 0) )
{
  sleep(3);
  echo '. ';
}
?>
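Note that an echo on its own may never reach the browser: PHP output buffering, Apache, and especially Varnish can all hold small writes back. A sketch that forces the output out (an assumption, not verified against this server's config; Varnish may still buffer the response unless it is excluded from caching):

```php
<?php
if( (($current_line % 10) == 0) && ($current_line != 0) )
{
    sleep(3);
    echo str_pad('. ', 4096);   // pad past typical 4 KB output buffers
    if (ob_get_level() > 0) {
        ob_flush();             // flush PHP's own output buffer, if active
    }
    flush();                    // push the buffered output on to Apache
}
?>
```

If the dots still don't appear incrementally, the buffering is happening in Varnish or Apache (e.g. mod_deflate), not in PHP.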