Is it possible to upload directly to S3 from a URL using POST?



I know there is a way to upload to S3 directly from the web browser using POST, without the files going through your backend server. But is there a way to do it from a URL instead of from a web browser?

For example, uploading a file that resides at http://example.com/dude.jpg directly to S3 using POST. I mean I don't want to download the asset to my server and then upload it to S3; I just want to make a POST request to S3 and have it pull in the file automatically.


琉璃繁缕 2024-12-14 01:26:42


It sounds like you want S3 itself to download the file from a remote server, with you only passing the URL of the resource to S3.

This is not currently supported by S3.

It takes an API client to actually transfer the content of the object to S3.

悍妇囚夫 2024-12-14 01:26:42


I thought I should share my code for achieving something similar. I was working on the backend, but you could possibly do something similar on the frontend; just be mindful that your AWS credentials would likely be exposed there.

For my purposes, I wanted to download a file from an external URL and then ultimately get back the S3 URL of the uploaded file.

I also used axios to fetch the file in an uploadable format and file-type to detect the proper MIME type of the file, but neither is strictly required.

Below is the snippet of my code:

// assumes the AWS SDK for JavaScript v2 (aws-sdk), axios, and file-type
// (a CommonJS release, e.g. v16) are installed
const AWS = require('aws-sdk');
const axios = require('axios');
const FileType = require('file-type');

const BUCKET_NAME = 'your-bucket-name';
const s3 = new AWS.S3();

async function uploadAttachmentToS3(type, buffer) {
  const params = {
    // file name you can get from the URL or in any other way; you could pass it
    // into the function as a parameter if necessary
    Key: 'yourfolder/directory/filename',
    Body: buffer,
    Bucket: BUCKET_NAME,
    ContentType: type,
    ACL: 'public-read' // makes the uploaded object publicly readable
  };
  // notice the use of the upload function, not the putObject function
  return s3.upload(params).promise().then((response) => {
    return response.Location;
  }, (err) => {
    return { type: 'error', err: err };
  });
}

async function downloadAttachment(url) {
  try {
    // responseType 'arraybuffer' gives us the raw bytes of the remote file
    const response = await axios.get(url, { responseType: 'arraybuffer' });
    const buffer = Buffer.from(response.data);
    // fall back to a generic MIME type if file-type cannot detect one
    const fileType = await FileType.fromBuffer(buffer);
    const type = fileType ? fileType.mime : 'application/octet-stream';
    return uploadAttachmentToS3(type, buffer);
  } catch (err) {
    return { type: 'error', err: err };
  }
}

// call this from inside an async function
let myS3Url = await downloadAttachment(url);

I hope it helps people who still struggle with similar issues. Good luck!

流年里的时光 2024-12-14 01:26:42


You can use rclone to achieve this easily:
https://rclone.org/commands/rclone_copyurl/

Create a new access key on AWS for rclone and use rclone config like this:
https://rclone.org/s3/
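
For reference, after running rclone config the resulting S3 remote in rclone.conf looks roughly like this (the remote name, keys and region below are placeholders):

[RCLONE_CONFIG_NAME]
type = s3
provider = AWS
access_key_id = YOUR_ACCESS_KEY_ID
secret_access_key = YOUR_SECRET_ACCESS_KEY
region = us-east-1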

Then, you can easily interact with your S3 buckets using rclone.

To upload from a URL (the -a flag takes the destination file name from the URL):
rclone -Pva copyurl {URL} RCLONE_CONFIG_NAME:/{BUCKET_NAME}/{FOLDER}/

It is quite handy for me as I am archiving my old files from Dropbox Business to S3 Glacier Deep Archive to save on Dropbox costs.

I can easily create a file transfer from Dropbox (100GB per file limit), copy the download link and upload directly to S3 using rclone.

It is copying at 10-12 MiB/s on a small DigitalOcean droplet.

没有你我更好 2024-12-14 01:26:42

如果可以的话,您可以使用 Cloudinary 作为 S3 的替代品。他们支持通过 URL 等远程上传。

https://cloudinary.com/documentation/image_upload_api_reference#upload_examples

If you are able you can use Cloudinary as an alternative to S3. They support remote upload via URL and more.

https://cloudinary.com/documentation/image_upload_api_reference#upload_examples
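
For example, a remote upload with the cloudinary Python package would look roughly like this (the cloud name, credentials and public_id below are placeholders):

import cloudinary
import cloudinary.uploader

cloudinary.config(cloud_name='my-cloud', api_key='YOUR_API_KEY', api_secret='YOUR_API_SECRET')

# Cloudinary fetches the remote URL itself, so nothing passes through your server
result = cloudinary.uploader.upload('http://example.com/dude.jpg', public_id='dude')
print(result['secure_url'])  # the hosted URL of the uploaded image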

瀞厅☆埖开 2024-12-14 01:26:42


I use this Python AWS Lambda function, quickly written 5 years ago:

import boto3
# newer botocore releases no longer ship the vendored copy of requests/urllib3,
# so import urllib3 directly (it is available in the Lambda Python runtime as a
# dependency of botocore)
import urllib3

def lambda_handler(event, context):

    url = event['url']
    bucket = 'your-bucket'
    key = event['filename']

    s3 = boto3.client('s3')
    http = urllib3.PoolManager()
    # preload_content=False keeps the response as a streaming, file-like object,
    # so the file is piped to S3 without being fully buffered in memory
    s3.upload_fileobj(http.request('GET', url, preload_content=False), bucket, key)
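
A hypothetical test event for this handler might look like the following; 'url' and 'filename' are the only keys it reads (the bucket name is hard-coded in the handler):

# hypothetical test event for the Lambda handler above
event = {
    "url": "http://example.com/dude.jpg",
    "filename": "uploads/dude.jpg"
}
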
も星光 2024-12-14 01:26:42


I found this article with some details. You will probably have to modify your bucket's security settings in some fashion to allow this type of interaction.

http://aws.amazon.com/articles/1434

There will be some security issues on the client side as well, since you never want your keys to be publicly accessible.
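
To avoid exposing keys in the browser, the usual approach is to have the backend generate a presigned POST policy and hand only that to the client. Here is a minimal sketch using boto3 (the bucket name, key and fields below are placeholders):

import boto3

s3 = boto3.client('s3')

# the browser then POSTs the file directly to post['url'] with post['fields']
# as hidden form fields; the secret key never leaves the backend
post = s3.generate_presigned_post(
    Bucket='your-bucket',
    Key='uploads/dude.jpg',
    Fields={'acl': 'public-read', 'Content-Type': 'image/jpeg'},
    Conditions=[{'acl': 'public-read'}, {'Content-Type': 'image/jpeg'}],
    ExpiresIn=3600,
)
print(post['url'])     # the form action URL
print(post['fields'])  # the fields to include in the multipart POST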
