Upload to Amazon S3 exceeds maximum execution time
I have a fairly simple problem and a very annoying error. I want to allow some users to upload images which I will store on Amazon S3. I have written an upload script that works fine when I feed it small images, but when the images are large (~1 MB), the script stops.
I think it has something to do with the script waiting for a response from Amazon and then timing out; the image does get uploaded, but the rest of the upload script (inserting into the DB) is skipped.
I have come across this question, How to sequence events in PHP for uploading files to Amazon S3, which is somewhat similar to my problem, but mine is a bit simpler (I hope).
I use Jumploader and the Amazon S3 class for PHP.
This is the line where the script stops and goes no further:
S3::putObject($full, 'bucket_name', $path, S3::ACL_PRIVATE)
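For context, the surrounding code is roughly this (a simplified sketch with placeholder field names, paths, and credentials, not my exact script):

<?php
// Simplified sketch of the upload handler.
require_once 'S3.php';  // the Amazon S3 class for PHP (credentials configured elsewhere)

// File delivered by Jumploader; read into memory for the S3 class.
$full = file_get_contents($_FILES['file']['tmp_name']);
$path = 'uploads/' . basename($_FILES['file']['name']);

// The call that never returns for images around 1 MB:
S3::putObject($full, 'bucket_name', $path, S3::ACL_PRIVATE);

// This part is skipped when the script times out:
$db   = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');
$stmt = $db->prepare('INSERT INTO images (s3_path) VALUES (?)');
$stmt->execute(array($path));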
Is there maybe some way of just kicking off the upload from my server to S3 and then executing the rest of the code (so the upload is asynchronous)?
Comments (1)
Increase the time limit:
set_time_limit($seconds);
http://www.php.net/set_time_limit
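For example, a minimal sketch of where the call would go in the upload script (300 seconds is an arbitrary value; the default limit comes from max_execution_time in php.ini, and 0 means no limit):

<?php
// Give this request up to 5 minutes before starting the S3 transfer.
set_time_limit(300);

S3::putObject($full, 'bucket_name', $path, S3::ACL_PRIVATE);

// The DB insert and the rest of the script now have time to run.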
If the delay is too long for the user to reasonably wait, add the job to a queue and use a scheduled task (cron job) to run a PHP script periodically to process the queued uploads.
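A rough sketch of that approach (the queue table, column names, and paths below are assumptions, not a fixed recipe): the request handler only records the job, and a cron-run worker does the actual S3 transfer.

<?php
// upload.php -- runs during the user's request: keep the file locally and queue it.
$local = '/var/uploads/pending/' . uniqid() . '_' . basename($_FILES['file']['name']);
move_uploaded_file($_FILES['file']['tmp_name'], $local);

$db   = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');
$stmt = $db->prepare("INSERT INTO s3_queue (local_path, s3_path, status) VALUES (?, ?, 'pending')");
$stmt->execute(array($local, 'uploads/' . basename($local)));
// Respond to the user immediately; the transfer happens later.

<?php
// worker.php -- run from cron, e.g. */5 * * * * php /path/to/worker.php
require_once 'S3.php';  // S3 credentials assumed to be configured as in the upload script
set_time_limit(0);      // a CLI/cron run can take as long as it needs

$db   = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');
$jobs = $db->query("SELECT id, local_path, s3_path FROM s3_queue WHERE status = 'pending'");

foreach ($jobs as $job) {
    $data = file_get_contents($job['local_path']);
    S3::putObject($data, 'bucket_name', $job['s3_path'], S3::ACL_PRIVATE);

    $db->prepare("UPDATE s3_queue SET status = 'done' WHERE id = ?")
       ->execute(array($job['id']));
    unlink($job['local_path']);  // local copy no longer needed
}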