Mysteriously interrupted file downloads via PHP on Dreamhost
I've written a simple PHP script to download a hidden file if the user has proper authentication. The whole setup works fine: it sends the proper headers, and the file transfer begins just fine (and, for small files, ends just fine).
However, when I try to serve a 150 MB file, the connection gets mysteriously interrupted somewhere close to the middle of the file. Here's the relevant code fragment (taken from somewhere on the Internet and adapted by me):
function readfile_chunked($filename, $retbytes = TRUE) {
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle) && (connection_status() == 0)) {
        $chunk = fread($handle, 1024 * 1024); // read and send 1 MB at a time
        print($chunk);
        $cnt += strlen($chunk);
        set_time_limit(0);
        ob_flush();
        flush();
    }
    $status = fclose($handle);
    // Like readfile(), return the number of bytes delivered if requested
    if ($retbytes && $status) {
        return $cnt;
    }
    return $status;
}
I also run some other code BEFORE calling the function above, in an attempt to solve the issue, but as far as I can tell it does nothing:
session_write_close(); // release the session lock so other requests aren't blocked
ob_end_clean();        // discard any existing output buffer
ignore_user_abort();   // keep running even if the client disconnects
set_time_limit(0);     // disable PHP's execution time limit
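For context, the header-sending part of the script looks roughly like this. This is a sketch, not my exact code; the file path and filename are illustrative, and the header values are the standard ones for forcing a download:

```php
<?php
// Illustrative sketch of the headers sent before streaming the file.
// $filepath is a hypothetical variable pointing at the hidden file.
$filepath = '/path/to/hidden/file.zip';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filepath) . '"');
header('Content-Length: ' . filesize($filepath));

readfile_chunked($filepath);
```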
As you can see, it doesn't attempt to load the whole file into memory at once or anything insane like that. To make it even more puzzling, the actual point where the transfer is killed seems to float between 50 and 110 MB, and it seems to kill ALL connections to the same file within a few seconds of each other (tested by downloading simultaneously with a friend). Nothing is appended to the interrupted file, and I see no errors in the logs.
I'm using Dreamhost, so I suspect that their watchdog might be killing my process because it's been running for too long. Does anyone have any experience to share on the matter? Could something else be the issue? Is there any workaround?
For the record, my Dreamhost account is set up to use PHP 5.2.1 with FastCGI.
1 Answer
I have little experience with Dreamhost, but you could use mod_xsendfile instead (if Dreamhost allows it).
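A minimal sketch of the X-Sendfile approach, assuming mod_xsendfile is installed and enabled for your vhost (the file path here is illustrative). PHP sends only the headers; Apache intercepts the X-Sendfile header and streams the file itself, so no long-running PHP process is left for a watchdog to kill:

```php
<?php
// Requires mod_xsendfile, enabled in the Apache config or .htaccess:
//   XSendFile On
//   XSendFilePath /home/user/hidden    (illustrative path)
$filepath = '/home/user/hidden/bigfile.zip'; // illustrative path

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filepath) . '"');
// Apache picks up this header and serves the file directly;
// the PHP script finishes immediately.
header('X-Sendfile: ' . $filepath);
exit;
```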