Potential issues with a very long PHP script
I have a PHP script that runs once a day, and it takes a good 30 minutes to run (I think). Everything in it is a safe and secure operation. I keep getting a 500 error after about 10-15 minutes, but I can't see anything in the logs, so I'm a bit confused.
So far the things I've set to "unlimited" are:
- max_execution_time
- max_input_time
- default_socket_timeout
I've also set these to obscenely high numbers just for this section (the folder in which the script runs):
- memory_limit
- post_max_size
The script talks to a SOAP-type API: it imports thousands of rows of data from a 3rd-party URL, puts them into a local MySQL table, and then downloads the image attached to each and every row, so the amount of data is significant.
I'm trying to figure out what other PHP variables etc. I'm missing in order to get this to run all the way through. Other PHP vars I have set (see the sketch after this list):
- display_errors = On
- log_errors = On
- error_reporting = E_ALL & ~E_NOTICE & ~E_WARNING
- error_log = "error_log"
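For illustration only, here is a minimal sketch of how overrides like these might be applied at the top of the script itself; the concrete values, and the choice of ini_set() over a per-folder php.ini or .htaccess, are assumptions rather than the setup described above. (max_input_time and post_max_size are per-directory settings and cannot be changed at runtime, so they are left out.)

    <?php
    // Sketch: per-script runtime overrides (values are placeholders, not the real config).
    ini_set('max_execution_time', '0');          // no PHP execution time limit
    ini_set('default_socket_timeout', '3600');   // generous timeout for the SOAP calls
    ini_set('memory_limit', '2048M');            // "obscenely high" memory ceiling
    ini_set('display_errors', '1');
    ini_set('log_errors', '1');
    error_reporting(E_ALL & ~E_NOTICE & ~E_WARNING);
    ini_set('error_log', __DIR__ . '/error_log');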
4 Answers
There are three timeouts involved here; in your case it seems like Apache reached its timeout. In that situation it is better to use the PHP CLI. But if you really need to do this operation in real time, then you can make use of Gearman, through which you can achieve true parallelism in PHP.
If you need a simple solution that triggers the script from a normal HTTP request (Browser -> Apache), you can run your back-end script (the CLI script) as a shell command from PHP, but 'asynchronously'. More info can be found in Asynchronous shell exec in PHP.
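A minimal sketch of that pattern, assuming the long-running work lives in a hypothetical import.php, exec() is not disabled, and the server is Unix-like: the web request only launches the CLI process and returns immediately, while all output goes to a log file.

    <?php
    // Web-facing endpoint: kick off the long import without waiting for it to finish.
    // import.php and the log path are placeholders for your own script and location.
    $cmd = '/usr/bin/php ' . escapeshellarg(__DIR__ . '/import.php');
    $log = '/var/log/import.log';
    // nohup + & detach the process from the Apache request; output is redirected to the log.
    exec('nohup ' . $cmd . ' >> ' . escapeshellarg($log) . ' 2>&1 &');
    echo 'Import started; check the log for progress.';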
Try using the PHP Command-Line Interface (php-cli) for the lengthy task. Execution time is unlimited on the command line unless you set it or terminate the process yourself. You can also schedule it with a cron job.
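For example, a crontab entry along these lines (the PHP binary path, script path, and log path are assumptions for illustration) would run the import once a day at 3:00 AM and keep its output:

    0 3 * * * /usr/bin/php /path/to/yourscript.php >> /var/log/yourscript.log 2>&1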
Run it from the command line with PHP (e.g. php yourscript.php) and this error shouldn't occur. Also, it's not a good idea to use set_time_limit(0); you should at most use set_time_limit(86400). You can set a cron job to do this once per day. Just make sure that all file paths in the script are absolute and not relative so it doesn't get confused.
Compiling the script might also help. HipHop is a great PHP compiler; your script will run faster, use less memory, and can use as many resources as it likes. HipHop is just very difficult to install.
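To illustrate the absolute-path advice with a hedged sketch (the file and directory names are placeholders), building every path from __DIR__ keeps the script working even when cron starts it from a different working directory:

    <?php
    // cron usually runs with a different working directory, so avoid relative paths.
    $configFile = __DIR__ . '/config.ini';   // instead of 'config.ini'
    $imageDir   = __DIR__ . '/images';       // instead of 'images'
    if (!is_dir($imageDir)) {
        mkdir($imageDir, 0775, true);        // create the image folder if it is missing
    }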
If the execution time is the problem, then maybe you should set the max execution time using the set_time_limit function inside the script:
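(A one-line sketch; 86400 seconds is one day, so pick whatever ceiling fits your run.)

    <?php
    // Allow this script to run for up to 24 hours instead of removing the limit entirely.
    set_time_limit(86400);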
I would also invoke the script on the command line using php directly, instead of through Apache. In addition, print out some status messages and pipe them into a log.
I suspect that your actual problem is that the script chokes on bad data somewhere along the line.
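A sketch of that workflow, with placeholder names standing in for the real import logic: emit periodic progress from the script, then redirect everything to a log when you start it.

    <?php
    // Sketch only: $rows and the commented-out importRow() stand in for the real SOAP import.
    $rows = range(1, 1000);                  // placeholder for the rows fetched from the API
    foreach ($rows as $i => $row) {
        // importRow($row);                  // your existing per-row import would go here
        if ($i % 100 === 0) {
            echo date('H:i:s') . " processed $i rows\n";
        }
    }

Started with something like php import.php >> import.log 2>&1, both the status messages and any errors end up in the log, which makes it much easier to see where bad data trips the script up.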