Move files directly from one S3 account to another?
Pretty basic question, but I haven't been able to find an answer. With Transmit I can "move" files from a bucket under one AWS account to a bucket under another AWS account, but what it actually does is download the files from the first account and then upload them to the second.
Is there a way to move files directly from one S3 account to the other, without downloading them in between?
Comments (12)
Move S3 files from one account to another

Let's say there are two accounts, a source account and a destination account, and two buckets, source-bucket and destination-bucket. We want to move all files from source-bucket to destination-bucket. We can do it with the following steps:

aws configure
aws s3 ls s3://source-bucket/
aws s3 cp s3://source-bucket s3://destination-bucket --recursive
aws s3 mv s3://source-bucket s3://destination-bucket --recursive

Alternatively, you can use the sync command:

aws s3 sync s3://source-bucket s3://destination-bucket

For a better explanation, follow the link.
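If the two buckets belong to different accounts, the credentials you configure also need cross-account access; a minimal sketch, assuming the copy is run with the source account's credentials (the profile name is hypothetical) and the destination bucket's policy already allows that account to write:

# Hypothetical profile holding the source account's keys
aws configure --profile source-account

# Server-side copy; bucket-owner-full-control hands ownership of the new objects
# to the destination bucket's owner
aws s3 cp s3://source-bucket s3://destination-bucket --recursive \
    --acl bucket-owner-full-control --profile source-account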
On Mac OS X I used the Transmit app from Panic. I opened one window for each S3 account (using the API keys and secrets). I could then drag from one bucket in one window to another bucket in the other window, with no need to download the files locally first.
Andrew is correct, though: Transmit actually downloads the files locally and then uploads them.
CrossFTP can copy S3 files straight from one bucket to another without downloading them. It is a GUI S3 client that works on Windows, Mac, and Linux.
You can use Cyberduck (open source).
For newly created files (NOT existing objects), you can take advantage of new functionality from AWS. It is Cross-Region Replication (under "Versioning" for the S3 bucket). You can create a policy that will allow you to replicate new objects to a bucket in a different account.
For existing objects, you will still need to copy your objects using another method - unless AWS introduces native functionality for this in the future.
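A rough sketch of what that cross-account replication setup can look like from the CLI; the role ARN, account ID and bucket names are placeholders, versioning must already be enabled on both buckets, and the replication role plus the destination bucket policy are assumed to be set up separately:

# Versioning is a prerequisite for replication on both buckets
aws s3api put-bucket-versioning --bucket source-bucket \
    --versioning-configuration Status=Enabled

# Replication rule that sends new objects to a bucket owned by another account
cat > replication.json <<'EOF'
{
  "Role": "arn:aws:iam::111111111111:role/s3-replication-role",
  "Rules": [
    {
      "Status": "Enabled",
      "Prefix": "",
      "Destination": {
        "Bucket": "arn:aws:s3:::destination-bucket",
        "Account": "222222222222",
        "AccessControlTranslation": { "Owner": "Destination" }
      }
    }
  ]
}
EOF

aws s3api put-bucket-replication --bucket source-bucket \
    --replication-configuration file://replication.json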
Yes, you can transfer a whole S3 bucket from your root account to another AWS root account.
I tried the options given here, but they didn't work for me; I also explored solutions from blogs, but those didn't work for me either. So I started digging through the Properties and Permissions tabs of the S3 bucket.
In the end I found a solution that is very easy to achieve, and it does not require creating any IAM role or policy. Just follow the given steps.
Prerequisites:
Steps:
The copy command (sketched below) does a copy-and-paste operation; if you want to move instead, use mv in place of cp. Replace source-bucket with the actual bucket name you want to copy from, and destination-bucket with the actual bucket name you want to copy to. You can also specify the source and destination region names.
You can do this from your own machine, or you can spin up an EC2 instance and transfer your S3 data from there.
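A sketch of such a copy command, assuming the AWS CLI can reach both buckets with the configured credentials; the bucket names and regions are placeholders:

# Copy everything; swap cp for mv to move instead of copy
aws s3 cp s3://source-bucket s3://destination-bucket --recursive \
    --source-region us-east-1 --region us-west-2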
One can do it by running the following:
aws s3 mv s3://source-bucket s3://destination-bucket --recursive
(use sync instead of mv if you want to keep the buckets in sync rather than move)
For the cross-account part:
Attach a bucket policy to the source bucket in the source account.
Attach an AWS Identity and Access Management (IAM) policy to a user or role in the destination account (both policies are sketched below).
Use the IAM user or role in the destination account to perform the cross-account move.
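A minimal sketch of those two policies, assuming the source account ID is 111111111111, the destination account ID is 222222222222, and the move is run as a hypothetical IAM user named transfer-user in the destination account; all names and IDs are placeholders:

# Run in the source account: let the destination account list, read and (for mv) delete in source-bucket
cat > source-bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::222222222222:root" },
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:DeleteObject"],
      "Resource": [
        "arn:aws:s3:::source-bucket",
        "arn:aws:s3:::source-bucket/*"
      ]
    }
  ]
}
EOF
aws s3api put-bucket-policy --bucket source-bucket --policy file://source-bucket-policy.json

# Run in the destination account: let transfer-user read the source bucket and write the destination bucket
cat > transfer-user-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:DeleteObject"],
      "Resource": ["arn:aws:s3:::source-bucket", "arn:aws:s3:::source-bucket/*"]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:PutObject"],
      "Resource": ["arn:aws:s3:::destination-bucket", "arn:aws:s3:::destination-bucket/*"]
    }
  ]
}
EOF
aws iam put-user-policy --user-name transfer-user \
    --policy-name s3-cross-account-move --policy-document file://transfer-user-policy.json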
The answers given so far all require an account that has access to both the source and target S3 buckets. I recently found myself in a situation where this was not allowed (for various non-technical company reasons that we'll just assume were good).
The solution I ended up going with was to:
Mount the target bucket with s3fs somewhere (e.g. /mnt/target).
Run aws s3 sync s3://source_bucket/folder /mnt/target/folder ... (or mv or cp as needed).
This is the easiest way I've seen to copy between folders when a single IAM role with permission to both is not allowed, and when using an intermediate location is prohibitive. A sketch of the two steps follows below.
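A rough sketch of that setup, assuming s3fs-fuse is installed; the bucket names, the passwd file, the mount point and the source-account profile are all placeholders:

# Credentials for the account that owns the target bucket (placeholder keys)
echo "TARGET_ACCESS_KEY:TARGET_SECRET_KEY" > ${HOME}/.passwd-s3fs-target
chmod 600 ${HOME}/.passwd-s3fs-target

# Mount the target bucket via FUSE (creating the mount point may need root)
sudo mkdir -p /mnt/target
sudo chown "$USER" /mnt/target
s3fs target-bucket /mnt/target -o passwd_file=${HOME}/.passwd-s3fs-target

# Read from the source bucket with the source account's credentials, write into the mount
AWS_PROFILE=source-account aws s3 sync s3://source_bucket/folder /mnt/target/folder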
Yes, there is a way. And it's pretty simple, though it's hard to find. 8)
For example, suppose your first account is acc1 and your second is acc2.
Open the AWS Management Console as acc1. Go to the Amazon S3 bucket properties, and in the "Permissions" tab click "Add more permissions". Then add List and View Permissions for "Authenticated Users".
Next, in AWS IAM of acc2 (it's accessible from among the console tabs), create a user with full access to the S3 bucket (to be more secure you can set up exact permissions, but I prefer to create a temporary user for the transfer and then delete it).
Then you can use s3cmd (with the credentials of the newly created user in acc2) to do something like the command sketched below.
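A plausible sketch of such an s3cmd invocation, assuming s3cmd has been configured with the new acc2 user's keys; the bucket names are placeholders:

# Server-side copy of every object from one bucket to the other
s3cmd cp --recursive s3://source-bucket/ s3://destination-bucket/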
All of the transfer is done on Amazon's side.
Use the AWS CLI (I used an Ubuntu 14 EC2 instance) and just run a sync command along the lines of the sketch below.
You will need to configure the account details for one account, and have public write access or public read access to the other.
This will sync the two buckets. You can run the same command again later to sync quickly. The best part is that it doesn't seem to require any bandwidth on your side (i.e. the files do not pass through the local computer).
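A sketch of that sync, with placeholder bucket names:

aws s3 sync s3://source-bucket s3://destination-bucket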
If you are just looking for a ready-made solution, there are a few out there that can do this. Bucket Explorer works on Mac and Windows and can copy across accounts, as can Cloudberry S3 Explorer and S3 Browser, but the latter two are Windows-only so may not work for you.
I suspect the AWS console could also do it with the appropriate permissions setup but I haven't tested this.
You can also do it using the AWS API as long as you have given the AWS account you are using write permissions to the destination bucket.
boto works well. See this thread. Using boto, you copy objects straight from one bucket to another, rather than downloading them to the local machine and uploading them to another bucket.