Move files directly from one S3 account to another?

Posted 2024-10-29 09:18:08

Pretty basic question, but I haven't been able to find an answer. Using Transmit I can "move" files from one S3 bucket on one AWS account to another S3 bucket on another AWS account, but what it actually does is download the files from the first account and then upload them to the second.

Is there a way to move files directly from one S3 account to another without downloading them in between?

12 answers

标点 2024-11-05 09:18:09

Move S3 files from One account to another account

Let's say there are two accounts, a source account and a destination account, and two buckets, source-bucket and destination-bucket. We want to move all files from source-bucket to destination-bucket. We can do it with the following steps:

  1. aws configure
    • Configure the AWS CLI for your destination account using credentials or an IAM role.
  2. Create a user policy for the destination account user.
  3. Give the destination user access to source-bucket by modifying the source-bucket policy and adding the destination account user policy to it. This way, the destination user has access to source-bucket.
  4. aws s3 ls s3://source-bucket/
    • This checks whether the destination account has access to source-bucket. Do this just for confirmation.
  5. aws s3 cp s3://source-bucket s3://destination-bucket --recursive
    • This copies all files from source-bucket to destination-bucket; the --recursive flag copies every object.
  6. aws s3 mv s3://source-bucket s3://destination-bucket --recursive
    • This moves all the files from source-bucket to destination-bucket.
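
The bucket policy from step 3 could look roughly like the sketch below. This is an untested illustration, not the answer author's exact policy; the account ID (222222222222) and user name (destination-user) are placeholders you would replace with your own:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowDestinationAccountUser",
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::222222222222:user/destination-user"},
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:DeleteObject"],
      "Resource": [
        "arn:aws:s3:::source-bucket",
        "arn:aws:s3:::source-bucket/*"
      ]
    }
  ]
}
```

s3:DeleteObject is only needed for the mv step; for a pure copy, ListBucket and GetObject suffice.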

Alternatively, you can use the sync command:
- aws s3 sync s3://source-bucket s3://destination-bucket

For a better explanation, follow the link.

幻梦 2024-11-05 09:18:09

On Mac OS X I used the Transmit app from Panic. I opened one window for each S3 account (using the API Keys and secrets). I could then drag from one bucket in one window to another bucket in the other window. No need to download files locally first.

Andrew is correct: Transmit downloads the files locally and then uploads them.

老娘不死你永远是小三 2024-11-05 09:18:09

CrossFTP can copy S3 files straight from one bucket to another without downloading them. It is a GUI S3 client that works on Windows, Mac, and Linux.

妞丶爷亲个 2024-11-05 09:18:09

You can use Cyberduck (open source).

等你爱我 2024-11-05 09:18:09

For newly created files (NOT existing objects), you can take advantage of new functionality from AWS: Cross-Region Replication (under "Versioning" for the S3 bucket). You can create a policy that allows you to replicate new objects to a bucket in a different account.

For existing objects, you will still need to copy your objects using another method - unless AWS introduces native functionality for this in the future.
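
As a sketch, a cross-account replication configuration (applied with `aws s3api put-bucket-replication`, with versioning enabled on both buckets) might look roughly like this; the role ARN and destination account ID are placeholders, not values from the answer:

```json
{
  "Role": "arn:aws:iam::111111111111:role/s3-replication-role",
  "Rules": [
    {
      "Status": "Enabled",
      "Priority": 1,
      "Filter": {},
      "DeleteMarkerReplication": {"Status": "Disabled"},
      "Destination": {
        "Bucket": "arn:aws:s3:::destination-bucket",
        "Account": "222222222222",
        "AccessControlTranslation": {"Owner": "Destination"}
      }
    }
  ]
}
```

The AccessControlTranslation block makes replicated objects owned by the destination account rather than the source account.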

凉宸 2024-11-05 09:18:09

Yes, you can transfer a whole S3 bucket from your root account to another AWS root account.

I tried the options given here, but they didn't work for me; I also explored solutions from blogs, but those didn't work either. So I started exploring the Properties and Permissions tabs of the S3 bucket.

At last, I found a solution that is very easy to achieve, and it does not require creating any IAM role or policy. Just follow the steps below.

Prerequisites:

  • AWS CLI installed and configured
  • An S3 bucket created in both the source and destination accounts

Steps:

  • Navigate to the destination S3 bucket and click on the Permissions tab
  • Scroll down to Access control list (ACL) and click the Edit button
  • Scroll down to Access for other AWS accounts and click the Add grantee button
  • Enter your canonical ID in the text box and check the object Read and Write permission boxes
  • You can get your canonical ID by clicking your AWS account name in the top right corner of the window -> Security credentials -> on that page, you can copy your canonical ID
  • After adding the grantee, click the Save changes button
  • Now open your terminal/cmd and run the command below

aws s3 cp --recursive s3://source-bucket s3://destination-bucket
--source-region source-region --region destination-region --acl bucket-owner-full-control

This command performs a copy; if you want to move instead, use mv in place of cp in the command above.

Replace source-bucket with the actual name of the bucket you want to copy from, and destination-bucket with the actual name of the bucket you want to copy to.

You can also specify the source and destination region names.

You can run this from your own machine, or spin up an EC2 instance and transfer your S3 data from there.
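
The console clicks above amount to adding a grant to the destination bucket's ACL. The same grant could be expressed as an access-control-policy document for `aws s3api put-bucket-acl --bucket destination-bucket --access-control-policy file://acl.json`, roughly as in this sketch (both canonical IDs are placeholders; note that put-bucket-acl replaces the entire ACL, so the owner's FULL_CONTROL grant must be restated):

```json
{
  "Owner": {"ID": "DESTINATION_ACCOUNT_CANONICAL_ID"},
  "Grants": [
    {
      "Grantee": {"Type": "CanonicalUser", "ID": "DESTINATION_ACCOUNT_CANONICAL_ID"},
      "Permission": "FULL_CONTROL"
    },
    {
      "Grantee": {"Type": "CanonicalUser", "ID": "SOURCE_ACCOUNT_CANONICAL_ID"},
      "Permission": "WRITE"
    }
  ]
}
```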

南汐寒笙箫 2024-11-05 09:18:09

One can do it by running the following:

aws s3 mv (or sync, to keep the buckets in sync) s3://source-bucket s3://destination-bucket --recursive

  1. Attach a bucket policy to the source bucket in the source account.

  2. Attach an AWS Identity and Access Management (IAM) policy to a user or role in the destination account.

  3. Use the IAM user or role in the destination account to perform the cross-account move.
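
The IAM policy from step 2 might look roughly like this sketch (an untested illustration; the bucket names are the placeholders used elsewhere on this page). It grants the destination-account user read-and-delete on the source bucket and write on the destination bucket, which is what a cross-account mv needs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:DeleteObject"],
      "Resource": [
        "arn:aws:s3:::source-bucket",
        "arn:aws:s3:::source-bucket/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:PutObject"],
      "Resource": [
        "arn:aws:s3:::destination-bucket",
        "arn:aws:s3:::destination-bucket/*"
      ]
    }
  ]
}
```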

記憶穿過時間隧道 2024-11-05 09:18:09

The given answers so far all require an account that has access to both the source and target s3 buckets. I've found myself recently in a situation where this was not allowed (for various non-technical company reasons that we'll just assume were good).

The solution I ended up going with was to:

  1. Spin up an EC2 instance with permission to write to the target bucket (you can do this locally, but the bandwidth and network I/O out of AWS make EC2 worth it; any tiny instance will do)
  2. Mount the destination folder with s3fs somewhere (/mnt/target)
  3. Give my command line read access (via AWS_ACCESS_KEY_ID, etc.) to the source bucket
  4. Use aws s3 sync s3://source_bucket/folder /mnt/target/folder ... (or mv or cp as needed)

This is the easiest way I've seen to copy between buckets when a single IAM role with permission to both isn't allowed, and when using an intermediate location is prohibitive.

大姐,你呐 2024-11-05 09:18:08

Yes, there is a way. And it's pretty simple, though it's hard to find. 8)

For example, suppose your first account username is [email protected] and second is [email protected].

Open AWS Management Console as acc1. Get to the Amazon S3 bucket properties, and in the "Permissions" tab click "Add more permissions". Then add List and View Permissions for "Authenticated Users".

Next, in AWS IAM (it's accessible from among the console tabs) of acc2 create a user with full access to the S3 bucket (to be more secure, you can set up exact permissions, but I prefer to create a temporary user for the transfer and then delete it).

Then you can use s3cmd (using the credentials of the newly created user in acc2) to do something like:

s3cmd cp s3://acc1_bucket/folder/ s3://acc2_bucket/folder --recursive

All transfer will be done on Amazon's side.

旧街凉风 2024-11-05 09:18:08

Use the aws cli (I used ubuntu 14 ec2 instance) and just run the following command:

aws s3 sync s3://bucket1 s3://bucket2

You will need to specify the account details for one, and have public write access or public read access to the other.

This will sync the two buckets. You can run the same command again later to sync quickly. The best part is that it doesn't seem to require any bandwidth (e.g. files do not pass through the local computer).

暮凉 2024-11-05 09:18:08

If you are just looking for a ready-made solution, there are a few out there that can do this. Bucket Explorer works on Mac and Windows and can copy across accounts, as can Cloudberry S3 Explorer and S3 Browser, but the latter two are Windows-only, so they may not work for you.

I suspect the AWS console could also do it with the appropriate permissions set up, but I haven't tested this.

You can also do it using the AWS API, as long as you have given the AWS account you are using write permission to the destination bucket.
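
The API operation in question is S3's PUT Object - Copy, where the server copies the object internally based on the `x-amz-copy-source` header. At the HTTP level it looks roughly like this (authentication and date headers omitted; bucket and key names are placeholders):

```http
PUT /destination-key HTTP/1.1
Host: destination-bucket.s3.amazonaws.com
x-amz-copy-source: /source-bucket/source-key
```

Because the request body is empty and the source is named by a header, the object data never leaves AWS.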

音盲 2024-11-05 09:18:08

boto works well. See this thread. Using boto, you copy objects straight from one bucket to another, rather than downloading them to the local machine and uploading them to another bucket.
