6xs

6xs stands for Simple Storage Service Static Site Sync.

It takes your /public directory (or whatever you call it) and pushes its contents (optionally matched with node-glob patterns) into a selected S3 bucket.

It can also:

  • remove remote files that are not found in your local directory
  • create an invalidation for a chosen CloudFront distribution

Usage

This usage example shows all available configuration options:

var sync = require('6xs');
var path = require('path');

sync({
  // defaults to: process.cwd() + '/public'
  base: path.join(__dirname, 'public'),
  // defaults to: '**'
  patterns: ['*.html', 'font/*'],
  // defaults to noop
  logger: function () {
    return console.log.apply(console, arguments);
  },
  // custom mappings between file extension and content type
  // if not provided, libmagic is used for detection
  // values below are used by default:
  contentTypeMap: {
    html: 'text/html',
    css: 'text/css',
    js: 'application/javascript',
    json: 'application/json'
  },
  aws: {
    // have to be provided:
    access_key_id: 'abcdef...',
    secret_access_key: 'xyz987...',
    // defaults:
    ssl: true,
    retries: 3,
    concurrency: 10
  },
  s3: {
    // have to be provided:
    region: 'eu-west-1',
    bucket: 'your-bucket-name',
    // defaults to false:
    remove_remote_surplus: true,
    // defaults (values in days, matching the CLI options below):
    max_age: 365,
    s_max_age: 1,
  },
  // if the distribution id is provided,
  // its content will be invalidated after the upload
  cf_distribution_id: 'qwerty...'
}, function (err, uploadedFiles) {
  // callback is optional
  // if upload was successful, err will be null
  // uploadedFiles is an array of paths
});
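
If the defaults above suit you, only the AWS credentials and the S3 target are required. A minimal sketch of such a call, reusing the placeholder values from the example above:

var sync = require('6xs');

// Minimal call: only the required settings; every other option
// falls back to the defaults documented in the full example above.
sync({
  aws: {
    access_key_id: 'abcdef...',
    secret_access_key: 'xyz987...'
  },
  s3: {
    region: 'eu-west-1',
    bucket: 'your-bucket-name'
  }
}, function (err, uploadedFiles) {
  if (err) {
    return console.error(err);
  }
  console.log('Uploaded %d files', uploadedFiles.length);
});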

CLI usage

Usage
  $ 6xs <settings/options>

  This will upload the current working directory to the specified S3 bucket.

Required settings
  -i, --id      AWS Access Key ID
  -s, --secret  AWS Secret Access Key
  -b, --bucket  AWS S3 Bucket name
  -r, --region  AWS region

Options
  -p, --patterns     Glob patterns of the files to upload
                     default: **
                     e.g. *.html
                     e.g. *.html,fonts/*

  -ma, --max-age     Cache-Control max-age header, in days
                     default: 365

  -sa, --s-max-age   Cache-Control s-maxage header, in days
                     default: 1

  --retries          Number of retries
                     Default: 3

  --concurrency      Number of concurrent uploads
                     Default: 10

  --remove-surplus   Remove remote files that are not
                     found in your local directory

  --no-ssl           Don't use SSL

  -cf, --cloudfront  The distribution ID to invalidate

Examples
  $ 6xs -i I2B -s KPAvL4GR -b my-s3-site.gov -r us-west-2 --remove-surplus
  Uploading: ...
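
Another invocation, sketched from the options listed above (the credentials and distribution ID are the same placeholders used elsewhere in this README), uploads only HTML files and fonts and then invalidates a CloudFront distribution:

  $ 6xs -i I2B -s KPAvL4GR -b my-s3-site.gov -r us-west-2 -p "*.html,fonts/*" -cf qwerty...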

Contributing

Pull requests and/or issue reports are warmly welcomed!

Running tests

$ npm run test
$ npm run coverage

Running integration tests locally

The Travis build won't run integration tests if your PR originates from a fork.

You'll need to provide four environment variables to run the integration tests locally. The user identified by the access key must have an appropriate allowing policy for the assigned S3 bucket.

$ AWS_ACCESS_KEY_ID=key-id \
AWS_SECRET_ACCESS_KEY=secret \
S3_REGION=your-region \
S3_BUCKET=your-test-bucket \
npm run test-integration

If you understand the implications, you can copy integration-test.sh.dist and adjust it to your needs.
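
The "appropriate allowing policy" mentioned above is not spelled out in this README, so the exact set of S3 actions 6xs needs is an assumption here. A minimal sketch of an IAM policy granting the usual listing and object-level permissions for the test bucket (the bucket name matches the S3_BUCKET placeholder above):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::your-test-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::your-test-bucket/*"
    }
  ]
}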

Contributors

License

MIT
