Is it possible to upload to S3 directly from a URL using POST?

I know there is a way to upload to S3 directly from the web browser using POST, without the files going through your backend server. But is there a way to do it from a URL instead of the web browser?

For example: upload a file that resides at http://example.com/dude.jpg directly to S3 using POST. I mean I don't want to download the asset to my server and then upload it to S3; I just want to make a POST request to S3 and have it upload the file automatically.
6 Answers
It sounds like you want S3 itself to download the file from a remote server, where you only pass the URL of the resource to S3. This is not currently supported by S3: it takes an API client to actually transfer the object's content into S3.
I thought I should share my code for achieving something similar. I was working on the backend, but you could possibly do something similar on the frontend, though be mindful that your AWS credentials would likely be exposed.

For my purposes, I wanted to download a file from an external URL and then ultimately get back the S3 URL of the uploaded file.

I also used axios to get the file into an uploadable format, and file-type to detect the proper type of the file, but neither is a requirement. Below is the snippet of my code:
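The code snippet itself did not survive in this copy of the answer. As a stand-in, here is a hedged Python sketch of the flow described (download from a URL, detect the content type, upload to S3, return the object URL); the original used Node.js with axios and file-type, so the function name, the boto3 calls, and the returned URL format below are assumptions, not the author's code.

```python
import mimetypes
from os.path import basename
from urllib.parse import urlparse
from urllib.request import urlopen


def upload_from_url(url, bucket, s3=None):
    """Fetch the file at `url`, upload it to S3, and return the object's URL."""
    # Object key defaults to the last path segment of the source URL.
    key = basename(urlparse(url).path) or "download"
    # Rough analogue of file-type: guess the MIME type from the file name.
    content_type = mimetypes.guess_type(key)[0] or "application/octet-stream"
    if s3 is None:
        import boto3  # assumed: boto3 installed, credentials in the environment
        s3 = boto3.client("s3")
    with urlopen(url) as resp:
        # Stream straight from the HTTP response into S3; no temp file needed.
        s3.upload_fileobj(resp, bucket, key,
                          ExtraArgs={"ContentType": content_type})
    return f"https://{bucket}.s3.amazonaws.com/{key}"
```

The `s3` parameter is injectable only so the sketch can be exercised without real AWS credentials; in practice you would let it default to a boto3 client.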
I hope it helps people who still struggle with similar issues. Good luck!
You can use rclone to achieve this easily:
https://rclone.org/commands/rclone_copyurl/
Create a new access key on AWS for rclone and use rclone config like this:
https://rclone.org/s3/
Then, you can easily interact with your S3 buckets using rclone.
To upload from a URL (note the subcommand is copyurl, per the first link above; the -a flag auto-names the destination file from the URL):
rclone -Pva copyurl {URL} RCLONE_CONFIG_NAME:/{BUCKET_NAME}/{FOLDER}/
It is quite handy for me, as I am archiving my old files from Dropbox Business to S3 Glacier Deep Archive to save on Dropbox costs.
I can easily create a file transfer from Dropbox (100 GB per-file limit), copy the download link, and upload directly to S3 using rclone.
It copies at 10-12 MiB/s on a small DigitalOcean droplet.
If you are able to, you can use Cloudinary as an alternative to S3; they support remote upload via URL, among other things.
https://cloudinary.com/documentation/image_upload_api_reference#upload_examples
I use this Python AWS Lambda function, quickly written 5 years ago:
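The function itself was not preserved in this copy of the answer. As a stand-in, here is a hedged sketch of a Lambda that does the described job (pull a file from a URL and put it in S3); the event shape, the helper, and the boto3 call are assumptions, not the original author's code. The execution role would need s3:PutObject on the target bucket.

```python
import urllib.request


def derive_key(url: str) -> str:
    """Default object key: the last path segment of the source URL."""
    return url.split("?", 1)[0].rstrip("/").rsplit("/", 1)[-1]


def lambda_handler(event, context):
    # Assumed event shape: {"url": ..., "bucket": ..., "key": optional}
    import boto3  # bundled in the AWS Lambda Python runtime
    s3 = boto3.client("s3")
    url, bucket = event["url"], event["bucket"]
    key = event.get("key") or derive_key(url)
    with urllib.request.urlopen(url) as resp:
        # Stream the HTTP response body straight into the bucket.
        s3.upload_fileobj(resp, bucket, key)
    return {"bucket": bucket, "key": key}
```

Invoked with {"url": "http://example.com/dude.jpg", "bucket": "my-bucket"}, this would store the file as s3://my-bucket/dude.jpg. Note the bytes still pass through the Lambda, since S3 itself will not fetch a URL.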
I found this article with some details; you will probably have to modify your bucket's security settings in some fashion to allow this type of interaction.
http://aws.amazon.com/articles/1434
There will be some security issues on the client side as well, since you never want your keys to be publicly accessible.