How to hide aws_key_id and aws_secret_key from the Rendered Template in AWS MWAA

Posted 2025-01-30 08:30:11


I am using AWS Managed Airflow (MWAA) with Airflow version 2.0.2. To set up external tables in Snowflake I am using an IAM user, and I need to pass aws_key_id and aws_secret_key in the Create Stage statements. I have templated my Create Stage statements and pass aws_key_id and aws_secret_key by reading them from AWS Secrets Manager. Here is what my Create Stage statement looks like:

CREATE OR REPLACE STAGE dev_stage.product_analytics.propsect_square_sftp_campaign_data0 URL='s3://rlg-eapedw-qa-curatedzone/nrtllc/curate_zone/campaign' credentials=
(aws_key_id='{{task_instance.xcom_pull(task_ids='read_secrets_manager', key='step_data_1')}}' aws_secret_key='{{task_instance.xcom_pull(task_ids='read_secrets_manager', key='step_data_2')}}') file_format = (TYPE=PARQUET);
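For context, a minimal sketch of what the upstream read_secrets_manager task might look like, assuming the credentials are stored as a single JSON secret in Secrets Manager (the secret name, region, and JSON keys below are hypothetical):

import json

import boto3
from airflow.operators.python import PythonOperator

def _read_secrets_manager(ti, **_):
    # Fetch the IAM user's access key pair from AWS Secrets Manager.
    client = boto3.client("secretsmanager", region_name="us-east-1")
    secret = json.loads(
        client.get_secret_value(SecretId="dev/snowflake/stage_iam_user")["SecretString"]
    )
    # Push both values so the templated CREATE STAGE can pull them via XCom.
    ti.xcom_push(key="step_data_1", value=secret["aws_key_id"])
    ti.xcom_push(key="step_data_2", value=secret["aws_secret_key"])

# Declared inside the DAG definition.
read_secrets_manager = PythonOperator(
    task_id="read_secrets_manager",
    python_callable=_read_secrets_manager,
)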

But I am finding that after the code runs successfully, the Rendered Template view shows the aws_key_id and aws_secret_key in plain text. I am wondering how to avoid this. I tried mask_secret from airflow.utils.log.secrets_masker, but it looks like this is not supported in 2.0.2.
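One workaround that does not depend on the 2.1+ masking utilities (a sketch, not the author's setup; the connection id, secret name, and region are assumptions) is to resolve the credentials inside the task at execution time and run the SQL through SnowflakeHook, so the values are never Jinja-templated, never pushed to XCom, and never appear in the Rendered Template:

import json

import boto3
from airflow.operators.python import PythonOperator
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

def _create_stage(**_):
    # Resolve the credentials only at execution time, inside the worker.
    client = boto3.client("secretsmanager", region_name="us-east-1")
    secret = json.loads(
        client.get_secret_value(SecretId="dev/snowflake/stage_iam_user")["SecretString"]
    )
    key_id = secret["aws_key_id"]
    secret_key = secret["aws_secret_key"]
    sql = f"""
        CREATE OR REPLACE STAGE dev_stage.product_analytics.propsect_square_sftp_campaign_data0
        URL = 's3://rlg-eapedw-qa-curatedzone/nrtllc/curate_zone/campaign'
        CREDENTIALS = (AWS_KEY_ID = '{key_id}' AWS_SECRET_KEY = '{secret_key}')
        FILE_FORMAT = (TYPE = PARQUET);
    """
    # The SQL is built and executed here, so the keys never show up in the UI.
    SnowflakeHook(snowflake_conn_id="snowflake_default").run(sql)

create_stage = PythonOperator(
    task_id="create_stage",
    python_callable=_create_stage,
)

The storage-integration approach in the answer below goes a step further and removes the keys from the SQL entirely.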

Wondering if anyone has solved this problem on AWS.


1 Answer

服软 2025-02-06 08:30:11


You can use the Snowflake Storage Integration.

Follow this guide and configure the IAM roles and policies, plus the integration itself on the Snowflake side.
https://docs.snowflake.com/en/user-guide/data-load-s3-config-storage-integration.html

Eventually you'll end up with a query looking like this:

create stage {stage_name}
  storage_integration = {integration_name}
  url = 's3://...'
  file_format = (TYPE = CSV SKIP_HEADER = 1);
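Back in MWAA, the stage creation then needs no credentials at all, so there is nothing sensitive left to show in the Rendered Template. A rough sketch against the stage from the question, assuming a snowflake_default connection and a storage integration named s3_int (both placeholders):

from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

create_stage = SnowflakeOperator(
    task_id="create_stage",
    snowflake_conn_id="snowflake_default",
    # No AWS keys anywhere: S3 access comes from the IAM role behind the storage integration.
    sql="""
        CREATE OR REPLACE STAGE dev_stage.product_analytics.propsect_square_sftp_campaign_data0
        URL = 's3://rlg-eapedw-qa-curatedzone/nrtllc/curate_zone/campaign'
        STORAGE_INTEGRATION = s3_int
        FILE_FORMAT = (TYPE = PARQUET);
    """,
)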