Allow users to bulk-download files from AWS S3 or CloudFront

Posted on 2024-12-05 02:18:23 · 415 characters · 1 view · 0 comments


I have a website that allows users to search for music tracks and download the ones they select as MP3s.

I have the site on my server and all of the MP3s on S3, distributed via CloudFront. So far so good.

The client now wishes for users to be able to select a number of music tracks and then download them all in bulk, as a batch, instead of one at a time.

Usually I would place all the files in a zip and then present the user with a link to that new zip file to download. In this case, since the files are on S3, that would require me to first copy all the files from S3 to my web server, process them into a zip, and then serve the download from my server.

Is there any way I can create a zip on S3 or CloudFront, or is there some way to batch/group files into a zip?

Maybe I could set up an EC2 instance to handle this?

I would greatly appreciate some direction.

Best

Joe
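The server-side approach described in the question (copy the selected objects down, zip them, serve the zip) can be sketched in Python. This is a minimal illustration, not a definitive implementation: it assumes `boto3` and AWS credentials are available on the server, and the function names, bucket, and key layout are hypothetical.

```python
import os
import zipfile


def build_zip(file_paths, zip_path):
    """Bundle already-downloaded files into a single zip archive."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in file_paths:
            # Store each file under its base name inside the archive.
            zf.write(path, arcname=os.path.basename(path))
    return zip_path


def fetch_and_zip(bucket, keys, cache_dir, zip_path):
    """Copy the user's selected objects out of S3, then zip them locally.

    Hypothetical helper: `bucket` and `keys` would come from your track
    database; requires boto3 and server-side AWS credentials.
    """
    import boto3  # deferred so build_zip is usable without AWS installed

    s3 = boto3.client("s3")
    local_paths = []
    for key in keys:
        local = os.path.join(cache_dir, os.path.basename(key))
        s3.download_file(bucket, key, local)
        local_paths.append(local)
    return build_zip(local_paths, zip_path)
```

The temporary zip can then be served directly, or uploaded back to S3 and handed to the user as a link, which keeps the web server out of the download path.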


Comments (3)

情绪少女 2024-12-12 02:18:23


I am afraid you won't be able to create the batches without additional processing. Firing up an EC2 instance might be an option to create a batch per user.

╭⌒浅淡时光〆 2024-12-12 02:18:23


I am facing the exact same problem. So far the only thing I was able to find is the AWS CLI's `s3 sync` tool:

https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html

In my case, I am using Rails + its Paperclip addon, which means I have no way to easily download all of a user's images in one go, because the files are scattered across a lot of subdirectories.

However, if you can group your users' files in a better way, say like this:

/users/<ID>/images/...
/users/<ID>/songs/...

...etc., then you can solve your problem right away with:

aws s3 sync s3://<your_bucket_name>/users/<user_id>/songs /cache/<user_id>

Do keep in mind that you'll have to give your server the proper credentials so the S3 CLI tools can run without prompting for usernames/passwords.

And that should sort you out.

Additional discussion here:
Downloading an entire S3 bucket?
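Once `aws s3 sync` has populated the per-user cache directory suggested above, turning it into a zip is a one-liner in Python via the standard library. A small sketch, assuming the `/cache/<user_id>` layout from this answer; `zip_user_cache` is an illustrative name, not an existing API.

```python
import shutil


def zip_user_cache(user_id, cache_root="/cache"):
    """Zip everything `aws s3 sync` pulled into the user's cache dir.

    Assumes the sync step above has already populated
    <cache_root>/<user_id>; produces <cache_root>/<user_id>.zip
    next to the cache directory and returns its path.
    """
    src = f"{cache_root}/{user_id}"
    return shutil.make_archive(src, "zip", root_dir=src)
```

`shutil.make_archive` walks the directory for you, so the scattered-subdirectories problem only matters for how you group files in S3, not for the zipping step.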

夜吻♂芭芘 2024-12-12 02:18:23


S3 is single-HTTP-request based: each object is fetched with its own request.

So the answer is threads, to achieve the same thing.

Java API: use TransferManager

http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/transfer/TransferManager.html

You can get great performance with multiple threads.

There is no bulk download, sorry.
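The same thread-per-request idea that TransferManager applies in Java can be sketched in Python with `concurrent.futures`. The per-object fetch function is parameterized here (a wrapper around boto3's `download_file` would be one choice); `download_all` and its signature are illustrative, not an existing API.

```python
from concurrent.futures import ThreadPoolExecutor


def download_all(keys, fetch_one, max_workers=8):
    """Download many S3 objects concurrently.

    `fetch_one(key)` performs one single-object download; since S3
    serves one object per HTTP request, throughput comes from issuing
    many requests in parallel. Results come back in the order of `keys`.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_one, keys))
```

Downloads are I/O-bound, so Python threads work well here despite the GIL; the thread count mainly trades memory and connection count against throughput.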
