s3-streams
Support for streaming reads and writes from and to S3 using Amazon's native API.
Amazon makes it a giant pain to do anything stream-like when it comes to S3 (given the general restriction that every request needs a `Content-Length` header). We provide native stream classes (both `Readable` and `Writable`) that wrap `aws-sdk` S3 requests and responses to make your life easier.
IMPORTANT: This library uses the streams3 API. In order to provide compatibility with older versions of node we make use of `readable-stream`. This is unlikely to have any effect on your code, but it has not yet been well tested. If you are using node `0.8` you must ensure your version of `npm` is at least `1.4.6`.
Features:
- Native read streams,
- Native write streams,
- Smart piping.
Usage
```sh
npm install s3-streams
```
Write Streams
Create streams for uploading to S3:
```js
var S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

var upload = S3S.WriteStream(new S3(), {
  Bucket: 'my-bucket',
  Key: 'my-key',
  // Any other AWS SDK options
  // ContentType: 'application/json'
  // Expires: new Date('2099-01-01')
  // ...
});
```
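A minimal usage sketch (the local file name `data.json` is an assumption for illustration): the write stream is a standard `Writable`, so any readable source, such as a file, can be piped into it, and the usual stream events apply:

```js
var fs = require('fs'),
    S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

var upload = S3S.WriteStream(new S3(), { Bucket: 'my-bucket', Key: 'my-key' });

// Hypothetical local file; any Readable source works the same way.
fs.createReadStream('data.json')
  .pipe(upload)
  .on('finish', function() {
    // 'finish' is the standard Writable completion event.
    console.log('upload complete');
  })
  .on('error', function(err) {
    console.error('upload failed', err);
  });
```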
Read Streams
Create streams for downloading from S3:
```js
var S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

var download = S3S.ReadStream(new S3(), {
  Bucket: 'my-bucket',
  Key: 'my-key',
  // Any other AWS SDK options
});
```
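And a corresponding sketch for downloads (the output path `local-copy.json` is assumed for illustration): the read stream is a standard `Readable`, so it can be piped to any writable destination, such as a local file:

```js
var fs = require('fs'),
    S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

var download = S3S.ReadStream(new S3(), { Bucket: 'my-bucket', Key: 'my-key' });

download
  .on('error', function(err) {
    // Standard stream error handling; S3-side failures (missing key,
    // bad credentials, ...) are expected to surface here.
    console.error('download failed', err);
  })
  .pipe(fs.createWriteStream('local-copy.json'));
```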
Smart Piping
Smart pipe files over HTTP:
```js
var http = require('http'),
    S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

http.createServer(function(req, res) {
  var src = S3S.ReadStream(...);
  // Automatically sets the correct HTTP headers
  src.pipe(res);
});
```
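As a fuller sketch of the same idea (the bucket name, the key-from-URL mapping, and the port are assumptions for illustration, not part of the library's docs):

```js
var http = require('http'),
    S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

var client = new S3();

http.createServer(function(req, res) {
  // Hypothetical mapping: treat the request path (minus the leading '/') as the S3 key.
  var src = S3S.ReadStream(client, {
    Bucket: 'my-bucket',
    Key: req.url.slice(1)
  });

  src.on('error', function(err) {
    // Fail the request if the object cannot be streamed.
    res.statusCode = 500;
    res.end('Unable to fetch object');
  });

  src.pipe(res);
}).listen(3000);
```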
Smart pipe files on S3:
```js
var S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

var src = S3S.ReadStream(...),
    dst = S3S.WriteStream(...);

// No data ever gets downloaded locally.
src.pipe(dst);
```
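A concrete sketch of an S3-to-S3 copy (the source and destination bucket/key names are assumptions for illustration):

```js
var S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

var client = new S3();

// Hypothetical source and destination objects.
var src = S3S.ReadStream(client, { Bucket: 'source-bucket', Key: 'my-key' }),
    dst = S3S.WriteStream(client, { Bucket: 'destination-bucket', Key: 'my-key' });

// The object streams through this process chunk by chunk,
// without ever being written to the local filesystem.
src.pipe(dst).on('finish', function() {
  console.log('copy complete');
});
```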
Extras
You can create streams bound to a specific S3 instance by partially applying that instance to the stream constructors:
```js
var _ = require('lodash'), // lodash (or underscore) provides _.partial
    S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

var instance = new S3(), s3 = {
  createReadStream: _.partial(S3S.ReadStream, instance),
  createWriteStream: _.partial(S3S.WriteStream, instance)
};

var stream = s3.createReadStream({ Bucket: 'my-bucket', Key: 'my-key' });
```
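The same factory pattern works without lodash by pre-filling the S3 instance with `Function.prototype.bind` (a sketch, not part of the library's documented API):

```js
var S3 = require('aws-sdk').S3,
    S3S = require('s3-streams');

var instance = new S3(), s3 = {
  // bind(null, instance) pre-fills the first argument, just like _.partial.
  createReadStream: S3S.ReadStream.bind(null, instance),
  createWriteStream: S3S.WriteStream.bind(null, instance)
};

var stream = s3.createReadStream({ Bucket: 'my-bucket', Key: 'my-key' });
```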
Existing frameworks:
- knox (doesn't use native AWS SDK, no true streaming support)
- s3-upload-stream (doesn't use node streams API, no support for streaming downloads)
- s3-download-stream (only does downloads, downloads are streamed by S3 part, not by individual buffer chunks)
- streaming-s3 (overall terrible API; no actual streams)
- create-s3-object-write-stream (probably one of the better ones)