Bitbucket Pipelines environment variables
I'm deploying my React app to an S3 bucket, and I have a lot of environment variables, so the question is: how can I handle the environment variables in Bitbucket Pipelines?
Error in the pipeline:
+ npm run build
> [email protected] build /opt/atlassian/pipelines/agent/build
> env-cmd -f .env.prod react-scripts build && cp build/index.html build/200.html
Error: Failed to find .env file at path: .env.prod
at getEnvFile (/opt/atlassian/pipelines/agent/build/node_modules/env-cmd/dist/get-env-vars.js:40:19)
at process._tickCallback (internal/process/next_tick.js:68:7)
at Function.Module.runMain (internal/modules/cjs/loader.js:834:11)
at startup (internal/bootstrap/node.js:283:19)
at bootstrapNodeJSCore (internal/bootstrap/node.js:623:3)
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! [email protected] build: `env-cmd -f .env.prod react-scripts build && cp build/index.html build/200.html`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] build script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2022-03-01T10_17_14_844Z-debug.log
bitbucket-pipelines.yml:
image: node:10

# Workflow Configuration

pipelines:
  default:
    - parallel:
        - step:
            name: Build and Test
            caches:
              - node
            script:
              - npm install
              # CI=true in default variables for Bitbucket Pipelines https://support.atlassian.com/bitbucket-cloud/docs/variables-in-pipelines/
  branches:
    master:
      - parallel:
          - step:
              name: Build and Test
              caches:
                - node
              script:
                - npm install
                # CI=true in default variables for Bitbucket Pipelines https://support.atlassian.com/bitbucket-cloud/docs/variables-in-pipelines/
                - npm run build
              artifacts:
                - build/**
          - step:
              name: Security Scan
              script:
                # Run a security scan for sensitive data.
                # See more security tools at https://bitbucket.org/product/features/pipelines/integrations?&category=security
                - pipe: atlassian/git-secrets-scan:0.5.1
      - step:
          name: Deploy to Production
          deployment: Production
          trigger: manual
          clone:
            enabled: false
          script:
            # sync your files to S3
            - pipe: atlassian/aws-s3-deploy:1.1.0
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                S3_BUCKET: $S3_BUCKET
                LOCAL_PATH: 'build'
            # triggering a distribution invalidation to refresh the CDN caches
            - pipe: atlassian/aws-cloudfront-invalidate:0.6.0
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                DISTRIBUTION_ID: '123xyz'
Comments (2)
The best solution I've found so far is using Bitbucket repository variables and passing them into the pipeline.
Relevant snippet from the support article:
https://support.atlassian.com/bitbucket-cloud/docs/variables-and-secrets/
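For example, here is a minimal sketch of a build step that relies only on repository variables (REACT_APP_API_URL is an assumed name). react-scripts inlines every REACT_APP_* variable it finds in the environment at build time, so the step no longer depends on a .env.prod file being present in the clone:

- step:
    name: Build and Test
    caches:
      - node
    script:
      - npm install
      # REACT_APP_API_URL is defined under Repository settings > Pipelines >
      # Repository variables (assumed name); react-scripts reads it from the
      # environment and inlines it into the production bundle.
      - npx react-scripts build
      - cp build/index.html build/200.html
    artifacts:
      - build/**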
Simply add a line in the script where you need the variables and load the file:
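A minimal sketch of that idea, assuming the values are stored as repository variables (the REACT_APP_* names below are illustrative): recreate .env.prod at the start of the step so that env-cmd -f .env.prod finds the file it expects:

script:
  - npm install
  # Recreate .env.prod from Bitbucket repository variables (illustrative names)
  # so that "env-cmd -f .env.prod react-scripts build" can locate the file.
  - printf 'REACT_APP_API_URL=%s\n' "$REACT_APP_API_URL" > .env.prod
  - printf 'REACT_APP_SENTRY_DSN=%s\n' "$REACT_APP_SENTRY_DSN" >> .env.prod
  - npm run build

Secured repository variables are masked in the pipeline log, so writing them into the file this way does not print their values in the build output.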