Reducing PHP memory consumption when handling uploads via php://input
I have nginx 1.0.5 + php-cgi (PHP 5.3.6) running.
I need to upload ~1 GB files (there must be 1-5 parallel uploads).
I am trying to implement uploads of big files through an ajax upload. Everything works, but PHP eats a lot of memory for each upload. I have set memory_limit = 200M, but it only works up to an uploaded file size of ~150 MB. If the file is bigger, the upload fails. I can keep setting memory_limit bigger and bigger, but I think that's the wrong way, because PHP could eat all the memory.
I use this PHP code (it's simplified) to handle uploads on the server side:
<?php
// Stream the raw request body to disk in 100 KB chunks.
// NB: $_GET['file'] is used unsanitized here (the code is simplified),
// and microtime() returns a string containing a space.
$input = fopen('php://input', 'rb');
$file  = fopen('/tmp/' . $_GET['file'] . microtime(), 'wb');
while (!feof($input)) {
    fwrite($file, fread($input, 102400));
}
fclose($input);
fclose($file);
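The loop above never holds more than one 100 KB chunk in userland, so if memory use still tracks the file size, the body is most likely being buffered by PHP before the script even runs (into $HTTP_RAW_POST_DATA, as the first comment below suggests). A minimal diagnostic variant of the same loop, with an assumed hard-coded path, can confirm this:

<?php
// Diagnostic sketch (path and chunk size are illustrative): stream
// php://input to disk and log peak memory after each chunk. If the
// logged value grows with the upload size despite the small reads,
// the request body was already buffered in memory before this loop.
$input = fopen('php://input', 'rb');
$file  = fopen('/tmp/upload.part', 'wb');
while (!feof($input)) {
    fwrite($file, fread($input, 102400));
    error_log('peak memory: ' . memory_get_peak_usage(true));
}
fclose($input);
fclose($file);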
/etc/nginx/nginx.conf:
user www-data;
worker_processes 100;
pid /var/run/nginx.pid;

events {
    worker_connections 768;
    # multi_accept on;
}

http {
    ##
    # Basic Settings
    ##
    sendfile on;
    tcp_nopush on;
    tcp_nodelay on;
    keepalive_timeout 65;
    types_hash_max_size 2048;
    client_max_body_size 2g;
    # server_tokens off;

    server_names_hash_max_size 2048;
    server_names_hash_bucket_size 128;
    # server_names_hash_bucket_size 64;
    # server_name_in_redirect off;

    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    ##
    # Logging Settings
    ##
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    ##
    # Gzip Settings
    ##
    gzip on;
    gzip_disable "msie6";

    include /etc/nginx/conf.d/*.conf;
    include /etc/nginx/sites-enabled/*;
}
/etc/nginx/sites-enabled/srv.conf:
server {
    listen 80;
    server_name srv.project.loc;

    # Define root
    set $fs_webroot "/home/andser/public_html/project/srv";
    root $fs_webroot;
    index index.php;

    # robots.txt
    location = /robots.txt {
        alias $fs_webroot/deny.robots.txt;
    }

    # Domain root
    location / {
        if ($request_method = OPTIONS) {
            add_header Access-Control-Allow-Origin "http://project.loc";
            add_header Access-Control-Allow-Methods "GET, OPTIONS, POST";
            add_header Access-Control-Allow-Headers "Authorization,X-Requested-With,X-File-Name,Content-Type";
            #add_header Access-Control-Allow-Headers "*";
            add_header Access-Control-Allow-Credentials "true";
            add_header Access-Control-Max-Age "10000";
            add_header Content-Length 0;
            add_header Content-Type text/plain;
            return 200;
        }
        try_files $uri $uri/ /index.php?$query_string;
    }

    #error_page 404 /404.htm

    location ~ index.php {
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME $fs_webroot/$fastcgi_script_name;
        include fastcgi_params;
        fastcgi_param REQUEST_METHOD $request_method;
        fastcgi_param PATH_INFO $fastcgi_script_name;
        add_header Pragma no-cache;
        add_header Cache-Control no-cache,must-revalidate;
        add_header Access-Control-Allow-Origin *;
        #add_header Access-Control-Allow-Headers "Content-Type, X-Requested-With, X-File-Name";
    }
}
Does anybody know a way to reduce PHP's memory consumption?
Thanks.
Comments (3)
There's a hack: fake the Content-Type header, changing it from application/octet-stream to multipart/form-data. That will stop PHP from populating $HTTP_RAW_POST_DATA. More details: https://github.com/valums/file-uploader/issues/61.
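For illustration, a hypothetical PHP test client for this hack (the URL, file name, and query parameter are placeholders, not part of the original setup):

<?php
// Hypothetical test client: stream big.bin to the upload endpoint as a
// POST body, but label it multipart/form-data so PHP's SAPI layer does
// not buffer it into $HTTP_RAW_POST_DATA.
$fh = fopen('big.bin', 'rb');
$ch = curl_init('http://srv.project.loc/?file=big.bin');
curl_setopt($ch, CURLOPT_UPLOAD, true);          // stream body from CURLOPT_INFILE
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'POST'); // POST instead of curl's default PUT
curl_setopt($ch, CURLOPT_INFILE, $fh);
curl_setopt($ch, CURLOPT_INFILESIZE, filesize('big.bin'));
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: multipart/form-data'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);
fclose($fh);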
I have been in the same shoes before, and this is what I did: split the files into different chunks during the upload process.
A good example is using plupload (http://www.plupload.com/index.php), or try a Java applet such as http://jupload.sourceforge.net, which also has resume capability for when there are network issues, etc.
The most important thing is that since you want your files uploaded via a web browser, there is nothing stopping you from doing so in chunks.
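As a rough sketch of the server side of that approach: each request then carries only one chunk, so memory use stays bounded by the chunk size. The "name", "chunk", and "chunks" query parameters below are assumptions, loosely following plupload's convention, and the paths are illustrative:

<?php
// Hypothetical chunked-upload receiver: append each chunk's body to a
// partial file, and finalize after the last chunk arrives.
$name   = basename($_GET['name']);
$chunk  = isset($_GET['chunk'])  ? (int)$_GET['chunk']  : 0;
$chunks = isset($_GET['chunks']) ? (int)$_GET['chunks'] : 1;

$in  = fopen('php://input', 'rb');
$out = fopen('/tmp/' . $name . '.part', $chunk === 0 ? 'wb' : 'ab');
while (!feof($in)) {
    fwrite($out, fread($in, 102400));
}
fclose($in);
fclose($out);

// Once the last chunk has been appended, move the file into place.
if ($chunk === $chunks - 1) {
    rename('/tmp/' . $name . '.part', '/tmp/' . $name);
}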
Why don't you try using Flash to upload huge files? For example, you can try swfupload, which has good support for PHP.
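For context, swfupload sends an ordinary multipart POST, so the receiving script is a standard $_FILES handler; a minimal sketch (assuming swfupload's default "Filedata" field name and an illustrative target directory):

<?php
// Minimal receiver sketch for a swfupload-style multipart POST.
// PHP streams multipart uploads to a temporary file itself, so
// memory use stays low; post_max_size and upload_max_filesize
// still have to cover the file size.
if (isset($_FILES['Filedata']) && $_FILES['Filedata']['error'] === UPLOAD_ERR_OK) {
    $name = basename($_FILES['Filedata']['name']);
    move_uploaded_file($_FILES['Filedata']['tmp_name'], '/tmp/' . $name);
}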