Amazon MWAA local runner: where to add Airflow variables and connections in docker-compose-local.yml

Published 2025-01-27 08:06:42


I am using the Amazon MWAA local runner repository to develop and test my DAGs locally before I submit a PR to the main/dev branch.
I have forked it from here
As soon as I start the container with ./mwaa-local-env start, I would like to export an Airflow variable and an Airflow connection:
The Airflow variable: key = deploy_environment, value = qa
The Airflow connection: conn id = slack_conn; conn type = HTTP; password = *****

Something like this: Airflow Connection (screenshot)

I was only able to change docker/docker-compose-local.yml to include the Airflow variable in the file:

version: '3.7'
services:
    postgres:
        image: postgres:10-alpine
        environment:
            - POSTGRES_USER=airflow
            - POSTGRES_PASSWORD=airflow
            - POSTGRES_DB=airflow
        logging:
            options:
                max-size: 10m
                max-file: "3"
        volumes:
            - "${PWD}/db-data:/var/lib/postgresql/data"

    local-runner:
        image: amazon/mwaa-local:2.0.2
        restart: always
        depends_on:
            - postgres
        environment:
            - LOAD_EX=n
            - EXECUTOR=Local
            - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
            - AIRFLOW_VAR_DEPLOY_ENVIRONMENT=qa
        logging:
            options:
                max-size: 10m
                max-file: "3"
        volumes:
            - ${PWD}/dags:/usr/local/airflow/dags
            - ${PWD}/plugins:/usr/local/airflow/plugins
            - $HOME/.aws/credentials:/usr/local/airflow/.aws/credentials:ro
        ports:
            - "8080:8080"
        command: local-runner
        healthcheck:
            test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
            interval: 30s
            timeout: 30s
            retries: 3

I thought AIRFLOW_VAR_DEPLOY_ENVIRONMENT=qa would do the job. However, this is what I get after I start the Airflow environment:
The value shows up as invalid!

For adding an Airflow connection, I have not been able to figure out how to export it in docker-compose-local.yml.

Any help in exporting these two is appreciated!


紙鸢 2025-02-03 08:06:42


The connection information can be stored as a JSON string in the environment variable. Not all of the keys are required; only provide what the Slack provider module needs. (Note: the JSON representation is supported from Airflow 2.3 onward; on the 2.0.2 image used in the question, use the URI format shown further below.)

export AIRFLOW_CONN_SLACK_CONN='{
    "conn_type": "my-conn-type",
    "login": "my-login",
    "password": "my-password",
    "host": "my-host",
    "port": 1234,
    "schema": "my-schema",
    "extra": {
        "param1": "val1",
        "param2": "val2"
    }
}'
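As a quick sanity check before wiring this into the container, you can confirm the exported value is valid JSON. This sketch assumes a POSIX shell with python3 on PATH; the conn_type, host, and password values are placeholders, not real credentials:

```shell
# Placeholder Slack connection; only conn_type, host, and password are set,
# since that is typically all an HTTP-based Slack hook needs.
export AIRFLOW_CONN_SLACK_CONN='{
    "conn_type": "http",
    "host": "https://hooks.slack.com/services",
    "password": "my-slack-token"
}'

# Fail fast if the string is not parseable JSON.
echo "$AIRFLOW_CONN_SLACK_CONN" | python3 -m json.tool > /dev/null && echo "valid JSON"
```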

You can also store the connection information in URI format in the environment variable.

export AIRFLOW_CONN_SLACK_CONN='my-conn-type://login:password@host:port/schema?param1=val1&param2=val2'

The environment variable name must be prefixed with AIRFLOW_CONN_.
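The mapping from conn_id to variable name is mechanical: uppercase the conn_id and prepend AIRFLOW_CONN_. A small illustrative sketch using the conn_id from the question:

```shell
conn_id="slack_conn"
# Airflow looks up AIRFLOW_CONN_<CONN_ID uppercased> in the environment.
env_var="AIRFLOW_CONN_$(echo "$conn_id" | tr '[:lower:]' '[:upper:]')"
echo "$env_var"   # AIRFLOW_CONN_SLACK_CONN
```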

Reference: Storing connections in environment variables (Airflow)
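Tying this back to the question: the same AIRFLOW_CONN_ entry can sit next to AIRFLOW_VAR_DEPLOY_ENVIRONMENT in the environment list of the local-runner service in docker/docker-compose-local.yml, so both are set whenever ./mwaa-local-env start brings the container up. A sketch of that fragment (the URI value is a placeholder, not a real credential):

```yaml
    local-runner:
        environment:
            - LOAD_EX=n
            - EXECUTOR=Local
            - AIRFLOW_VAR_DEPLOY_ENVIRONMENT=qa
            # URI form: <conn-type>://<login>:<password>@<host>:<port>/<schema>
            - AIRFLOW_CONN_SLACK_CONN=http://:my-slack-token@hooks.slack.com
```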
