Uploading/downloading large files in a web application
I have a web app and I want to offer the possibility of uploading and downloading big files (more than 2 GB).
Is it possible? Is there some open-source project that can help me?
Thank you very much.
3 Answers
Yes, Amazon S3 is great; I use it every day.
If, however, you want to self-host the data, I suggest BITS (Background Intelligent Transfer Service):
http://en.wikipedia.org/wiki/Background_Intelligent_Transfer_Service
http://www.codeplex.com/sharpbits
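The core trick BITS relies on is resumable transfer via HTTP Range requests: if a download is interrupted, the client asks the server only for the bytes it does not yet have. A minimal sketch of that idea in Python follows; the URL and destination path would be your own, and the helper names here are made up for illustration:

```python
"""Resumable download via HTTP Range requests (the mechanism BITS builds on).

A minimal sketch, not a full BITS client: no retry loop, no checksumming,
and it assumes the server honors Range requests (responds 206).
"""
import os
import urllib.request


def next_range_header(local_size: int) -> dict:
    """Build a Range header asking for everything after the bytes we already have."""
    return {"Range": f"bytes={local_size}-"}


def resume_download(url: str, dest: str, chunk_size: int = 1 << 20) -> None:
    # Start from the size of any partial file already on disk.
    offset = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url, headers=next_range_header(offset))
    with urllib.request.urlopen(req) as resp, open(dest, "ab") as out:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)
```

Restarting the same call after an interruption simply appends the missing tail instead of re-fetching the whole multi-gigabyte file.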
Clarification
As the OP stated in a comment, this question is related to J2EE and Flex. It has nothing to do with PHP or any of the web servers mentioned below.
My original answer
Yes, what you are trying to do is possible.
The problem most people encounter is the limits set by PHP. Most notably these are upload_max_filesize and post_max_size. Next, you will probably need to increase max_execution_time so that your script does not time out. The timeout is tricky, though, as it relates to the client's upload speed. These settings are best set on an "as-needed" basis (if possible) rather than in your core configuration (php.ini, Apache config, ...).
Apart from these (server-side) limits, nothing stops you from uploading large files. The web server itself may also impose limits: Apache has LimitRequestBody and lighttpd has server.max-request-size, for example. Another solution is to write a custom CGI script, but then you are still bound by the limits the web server imposes!
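The corresponding web-server directives might be set like this (values illustrative; note the two servers use different units, so check your server's documentation):

```
# Apache (httpd.conf or a <Directory> block) -- bytes; 0 means unlimited:
LimitRequestBody 0

# lighttpd (lighttpd.conf) -- kilobytes; 4194304 KB = 4 GB:
server.max-request-size = 4194304
```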
It would help to know which language you are writing your website in... ;)
I'd recommend taking a look at Amazon's S3. It may be the solution you are looking for, but even if not, it can provide you with an example of how to do big-file transfers using an HTTP REST request or a SOAP+DIME web service.
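One reason S3 handles multi-gigabyte files well is its multipart upload API: the file is split into independently uploaded parts. A sketch of the part-splitting arithmetic is below; the actual API calls are left as a comment, since they depend on your SDK and credentials:

```python
# Sketch of the part-splitting arithmetic behind S3 multipart upload.
# S3 documents a 5 MiB minimum for every part except the last one.

MIN_PART = 5 * 1024 * 1024


def split_parts(total_size: int, part_size: int = 64 * 1024 * 1024):
    """Yield (offset, length) pairs covering total_size in part_size chunks."""
    if part_size < MIN_PART:
        raise ValueError("part size below S3 minimum")
    offset = 0
    while offset < total_size:
        length = min(part_size, total_size - offset)
        yield offset, length
        offset += length


# A client would then call CreateMultipartUpload, send one UploadPart request
# per (offset, length) pair (retrying failed parts individually), and finish
# with CompleteMultipartUpload -- typically via an AWS SDK.
```

Because each part is an independent request, a failed part can be retried on its own instead of restarting a 2 GB transfer from scratch.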