How to download a large file from AWS S3 and restart the download after network loss
I am trying to implement a "network safe" downloader for an AWS S3 bucket.
The downloader should be able to download a single .zip file from S3 and write it to a local .zip file.
My current approach uses Node with a readStream and a writeStream, as follows:
const AWS = require('aws-sdk');
const fs = require('fs');
const path = require('path');

const download = async () => {
    AWS.config.update({
        accessKeyId: "",
        secretAccessKey: "",
        region: ""
    });
    const s3 = new AWS.S3();
    const params = {
        Bucket: '',
        Key: ''
    };
    // Fetch the object size first so progress can be reported as a percentage.
    const { ContentLength: contentLength } = await s3.headObject(params).promise();
    const rs = s3.getObject(params).createReadStream();
    const ws = fs.createWriteStream(path.join('./', 'file.zip'));
    let progress = 0;
    rs.on('data', function (chunk) {
        progress += chunk.length;
        console.log(`Progress: ${progress / contentLength * 100}%`);
    });
    rs.pipe(ws);
};
What I need is a way to catch/create an event for network errors that would allow me to pause and restart the download when the network comes back,
or even better, to automatically restart the download once the network is restored.
So far I could not find any events for network errors, and it seems that losing the network while a download is in progress does not trigger the 'error' event.
Any solution in Node/Python would be very much appreciated.
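
One way to get this behaviour with the v2 aws-sdk is to set socket timeouts via httpOptions, so that a dropped connection surfaces as an 'error' event instead of hanging, and to resume by re-requesting only the remaining bytes with an HTTP Range header on getObject. Below is a minimal sketch of that idea; the downloadWithResume helper, the timeout values, and the retry policy are illustrative assumptions, not code from the question.

const AWS = require('aws-sdk');
const fs = require('fs');
const path = require('path');

// Hypothetical helper: downloads params.Key to destPath, resuming from the
// bytes already on disk after every failure, up to maxRetries attempts.
const downloadWithResume = async (s3, params, destPath, maxRetries = 10) => {
    const { ContentLength: total } = await s3.headObject(params).promise();

    for (let attempt = 1; attempt <= maxRetries; attempt++) {
        // Resume from whatever actually made it to disk so far.
        const written = fs.existsSync(destPath) ? fs.statSync(destPath).size : 0;
        if (written >= total) return;
        try {
            await new Promise((resolve, reject) => {
                // Ask S3 only for the bytes we do not have yet.
                const rs = s3
                    .getObject({ ...params, Range: `bytes=${written}-` })
                    .createReadStream();
                // Append to the partial file instead of truncating it.
                const ws = fs.createWriteStream(destPath, { flags: 'a' });
                let progress = written;
                rs.on('data', (chunk) => {
                    progress += chunk.length;
                    console.log(`Progress: ${(progress / total * 100).toFixed(1)}%`);
                });
                rs.on('error', (err) => { ws.destroy(); reject(err); });
                ws.on('error', reject);
                ws.on('finish', resolve);
                rs.pipe(ws);
            });
            return; // completed without error
        } catch (err) {
            console.warn(`Attempt ${attempt} failed (${err.message}), retrying in 5s...`);
            await new Promise((r) => setTimeout(r, 5000));
        }
    }
    throw new Error(`Download did not complete after ${maxRetries} attempts`);
};

const s3 = new AWS.S3({
    // Without a socket timeout, a dead connection can hang silently instead of
    // emitting 'error'; these values are illustrative.
    httpOptions: { timeout: 30000, connectTimeout: 5000 }
});

downloadWithResume(s3, { Bucket: '', Key: '' }, path.join('./', 'file.zip'))
    .then(() => console.log('Done'))
    .catch(console.error);

The key design point is that progress is taken from the file on disk at the start of every attempt, so a retry never skips bytes that were read from the socket but not yet flushed to the file.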
Comments (1)
Short update: I found a workaround using wget to download a presigned URL for the object I want to download. It is still not the native experience I wanted to have. Share your thoughts.
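
For reference, this workaround can be driven from Node as well: generate a presigned URL with the v2 SDK's getSignedUrl and hand it to wget, whose -c/--continue flag resumes a partial file after an interruption. The flag values below are illustrative, and the snippet assumes wget is available on the PATH.

const AWS = require('aws-sdk');
const { spawn } = require('child_process');

const s3 = new AWS.S3();

// Presigned GET URL for the object; Expires is in seconds (here: one hour).
const url = s3.getSignedUrl('getObject', {
    Bucket: '',
    Key: '',
    Expires: 3600
});

// -c resumes a partially downloaded file.zip; --tries=0 retries indefinitely,
// --waitretry=5 waits up to 5 seconds between retries after network errors.
const wget = spawn('wget', ['-c', '--tries=0', '--waitretry=5', '-O', 'file.zip', url], {
    stdio: 'inherit'
});

wget.on('exit', (code) => console.log(`wget exited with code ${code}`));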