PHP file_get_contents sometimes fails during a cronjob

Posted on 2024-11-04 11:19:46

I am trying to run a php script via a cronjob and sometimes (about half the time) I get the following warning:

PHP Warning: file_get_contents(http://url.com): failed to open stream: HTTP request failed! in /path/myfile.php on line 285

The program continues to run after that, which makes me think it is not a timeout problem or a memory issue (the timeout is set to 10 minutes and memory to 128M), but the variable I am storing the result of that function call in is empty. The weird part is that I am making several other calls to this same website with other URL parameters and they never have a problem. The only difference with this call is that the file it is downloading is about 70 MB, while the others are all around 300 KB.

Also, I never get this warning if I SSH into the web server and run the php script manually, only when it is run from a cron.

I have also tried using cURL instead of file_get_contents but then I run out of memory.

Thanks, any help here would be appreciated.
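
Edit: one low-cost way to see what the remote server actually sends back in the failing case is to let file_get_contents keep error responses via a stream context. A rough sketch (the URL and timeout here are placeholders, not my real values):

// Sketch: capture the server's real response instead of just "HTTP request failed!".
$context = stream_context_create([
    'http' => [
        'timeout'       => 600,  // seconds; placeholder matching the 10-minute limit
        'ignore_errors' => true, // keep the body even on 4xx/5xx so it can be inspected
    ],
]);

$data = file_get_contents('http://url.com/big-file', false, $context);

// $http_response_header is set by the HTTP wrapper; its first entry is the status line.
var_dump($http_response_header[0] ?? 'no response headers');
if ($data === false) {
    error_log('file_get_contents failed outright (DNS, connection, or timeout?)');
}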

Comments (2)

烟织青萝梦 2024-11-11 11:19:46

Perhaps the remote server on URL.com is sometimes timing out or returning an error for that particular (large) request?

I don't think you should be trying to store 70 MB in a variable.

You can configure cURL to download directly to a file. Something like:

// Open the target file in binary mode and stream the download straight into it.
$file = fopen('my.file', 'wb');
$c = curl_init('http://url.com/whatever');
curl_setopt($c, CURLOPT_FILE, $file);       // write the response body to $file, not to memory
curl_setopt($c, CURLOPT_FAILONERROR, true); // treat HTTP 4xx/5xx responses as failures
if (curl_exec($c) === false) {
    error_log('cURL error: ' . curl_error($c));
}
curl_close($c);
fclose($file);

If nothing else, curl should provide you with much better errors about what's going wrong.
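
If you'd rather avoid cURL, roughly the same streaming behaviour is possible with plain PHP streams; a minimal sketch, where 'my.file' and the URL are placeholders:

// Sketch: copy the remote file to disk in chunks, without buffering it in a variable.
$src = fopen('http://url.com/whatever', 'rb');
$dst = fopen('my.file', 'wb');

if ($src && $dst) {
    stream_copy_to_stream($src, $dst); // chunked copy, so memory use stays small
}

if ($src) { fclose($src); }
if ($dst) { fclose($dst); }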

2024-11-11 11:19:46

From another answer... double-check that this issue isn't occurring some of the time with the URL parameters you're using:

Note: If you're opening a URI with special characters, such as spaces, you need to encode the URI with urlencode() - http://docs.php.net/file%5Fget%5Fcontents
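
For example (the parameter names here are made up), building the query string with http_build_query() encodes spaces and other special characters for you:

// Sketch: only the query values are encoded; the base URL is left untouched.
$base  = 'http://url.com/export';
$query = http_build_query(['report' => 'daily sales', 'size' => 'large']);
$data  = file_get_contents($base . '?' . $query); // spaces in values are encoded automatically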
