How to upload many files (50K+) / folders to AWS S3 with Node.js
I have a Node.js API running on a Windows machine that generates some XML files, which are later uploaded to an S3 bucket. The number of files exceeds 50k, and sometimes goes even higher.
In my current approach, I'm using the aws-sdk package for uploading. Basically, I loop through the folder that needs to be uploaded, read every file, and upload it.
const fs = require('fs');
const path = require('path');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

// List every entry in the directory (this runs inside an async function).
const files = fs.readdirSync(dirPath, {
  withFileTypes: true
});

for (const file of files) {
  const fileContent = fs.readFileSync(path.join(dirPath, file.name));
  const params = {
    Bucket: BUCKET_NAME,
    Key: `${folderPath}/${file.name}`,
    Body: fileContent
  };
  try {
    // Uploads run strictly one at a time: each must finish before the next starts.
    await s3.upload(params).promise();
  } catch (err) {
    // error handling
    return;
  }
}
This takes around 3-4 hours to complete. Is there a better way to bulk-upload the files, or any way to upload an entire folder at once?
Thanks in advance
Comments (1)
I would recommend you zip the folder first, and then upload the zipped archive to S3. In a bash script you can do that as shown below.
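A minimal sketch, assuming the zip utility is installed; the archive and folder names are placeholders:

# Recursively compress the folder into a single archive
zip -r my-files.zip /path/to/folder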
And then you can upload it to S3 as shown below.
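Again a sketch, assuming the AWS CLI is installed and configured; BUCKET_NAME is the same bucket used in the question:

# Copy the single archive to the bucket in one request
aws s3 cp my-files.zip s3://BUCKET_NAME/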
Hope that helps!