Using the S3FileTransformOperator in an MWAA environment

Posted 2025-02-01 14:26:29

I'm trying to use the S3FileTransformOperator in an MWAA environment, but I'm hitting a permission error on the script file:

PermissionError: [Errno 13] Permission denied

I tried to add a Bash operator with the chmod command before the task, but without success.

Has anyone ever used the S3FileTransformOperator in MWAA?

Comments (2)

巷子口的你 2025-02-08 14:26:29
Hi!

Unfortunately, with MWAA the worker containers are both ephemeral and limited to user-level access. The S3 operator should work with .sh files added to the /dags folder and referred to in the operator as /usr/local/airflow/dags/my_script.sh. The alternative would be to move the contents of your .py file into a PythonOperator and use the S3Hook to retrieve and store the file.

Thanks!

Reference: https://repost.aws/questions/QUvsbZds_NQTG7JSxCQ11djQ/s-3-file-transform-operator-permission-denied-on-script
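
A minimal sketch of the second suggestion (moving the transform into a PythonOperator and using the S3Hook directly), assuming Airflow 2.x provider imports; the DAG id, function name, bucket, and keys are illustrative:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.amazon.aws.hooks.s3 import S3Hook

    def transform_s3_file():
        hook = S3Hook()  # uses the default AWS connection
        # Read the source object into memory as a string.
        data = hook.read_key(key="path/to/sample.csv", bucket_name="bucket")
        result = data.upper()  # placeholder transformation
        # Write the transformed content back to S3.
        hook.load_string(result, key="path/to/result.csv", bucket_name="bucket", replace=True)

    with DAG(dag_id="s3_transform_python", start_date=datetime(2025, 1, 1), schedule_interval=None) as dag:
        transform = PythonOperator(task_id="transform", python_callable=transform_s3_file)

Because the transform runs inside the task itself, nothing needs an executable bit on the read-only dags volume.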

笑脸一如从前 2025-02-08 14:26:29

I tried this, but got another error at the first step:

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.s3_file_transform_operator import S3FileTransformOperator

with DAG(...) as dag:
    # Try to make the transform script executable before running it.
    chmod = BashOperator(
        task_id="chmod",
        bash_command="chmod +x /usr/local/airflow/dags/transform.py"
    )
    transform = S3FileTransformOperator(
        task_id="transform",
        source_s3_key="s3://bucket/path/to/sample.csv",
        dest_s3_key="s3://bucket/path/to/result.csv",
        transform_script="/usr/local/airflow/dags/transform.py",
    )

    chmod >> transform

The chmod task fails with:

chmod: changing permissions of ‘/usr/local/airflow/dags/transform.py’: Read-only file system

Another AWS reference shows that it's possible to run a bash-script transformation, but I haven't had any success with Python:

https://docs.aws.amazon.com/mwaa/latest/userguide/t-apache-airflow-202.html#op-s3-transform
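
For context on the bash route: S3FileTransformOperator downloads the source object to a temporary file and invokes the transform script with the local source and destination paths as its first two arguments. A shell transform shipped in the dags folder, as the first comment suggests, would look roughly like this (the uppercasing is only a placeholder transformation):

    #!/bin/sh
    # Invoked by S3FileTransformOperator as: transform.sh <source_file> <dest_file>
    tr '[:lower:]' '[:upper:]' < "$1" > "$2"

transform_script would then point at /usr/local/airflow/dags/transform.sh, per the first comment.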
