Running the Glue container locally using docker-compose
I would like to run the REPL shell (pyspark) from the Glue container using this command:

```sh
docker run -it -v ~/.aws:/home/glue_user/.aws -e AWS_PROFILE=$PROFILE_NAME -e DISABLE_SSL=true --rm -p 4040:4040 -p 18080:18080 --name glue_pyspark amazon/aws-glue-libs:glue_libs_3.0.0_image_01 pyspark
```

as mentioned in [pyspark-glue-container](https://docs.aws.amazon.com/glue/latest/dg/aws-glue-programming-etl-libraries.html), but instead of running it with `docker run` I would like to run it with `docker-compose`. Hence I created the compose file below:
```yaml
container_name: "glue_container"
image: amazon/aws-glue-libs:glue_libs_2.0.0_image_01
environment:
  DISABLE_SSL: "true"
ports:
  - 4040:40404
  - 18080:18080
command: pyspark
volumes:
  - ${PWD}/local_path_to_workspace:/home/glue_user/workspace/
```
However, I'm not able to get the Glue pyspark shell up and running; the container exits with an error instead.
2 Answers
I'm guessing you got a "volume must be a mapping" error. Try this format:
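The answer's original snippet did not survive extraction; presumably the fix was to nest everything under a named service, since Compose reads a top-level `volumes:` key as the named-volumes section, which must be a mapping — hence the error. A sketch, with `glue` as a placeholder service name:

```yaml
version: "3"
services:
  glue:                # placeholder service name
    container_name: glue_container
    image: amazon/aws-glue-libs:glue_libs_2.0.0_image_01
    environment:
      DISABLE_SSL: "true"
    ports:
      - "4040:4040"    # note: 4040, not 40404 as in the question
      - "18080:18080"
    command: pyspark
    volumes:
      - ${PWD}/local_path_to_workspace:/home/glue_user/workspace/
```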
Then run it like this:
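`docker-compose run` allocates a TTY and keeps STDIN open by default, which is what an interactive REPL needs (assuming the `glue` service name from the sketch above):

```sh
# One-off interactive container; runs the service's `command: pyspark`.
docker-compose run --rm glue
```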
The issue is how Docker Compose and the `pyspark` command interact within the container. When you run `pyspark` directly within a Docker container (as in the `docker run` command above), it typically expects an interactive terminal session. Adding the instructions below to your compose file solves the issue.
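A minimal sketch of what those instructions would be, assuming the standard Compose equivalents of `docker run -it`:

```yaml
# Add to the service definition so pyspark gets an interactive terminal.
stdin_open: true   # keep STDIN open, like `docker run -i`
tty: true          # allocate a pseudo-TTY, like `docker run -t`
```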
This is my compose file:
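The file itself was lost in extraction; a sketch combining the settings from the question with those two directives (the `glue_pyspark` service name and the 3.0.0 image tag are assumptions):

```yaml
version: "3"
services:
  glue_pyspark:        # assumed service name
    container_name: glue_container
    image: amazon/aws-glue-libs:glue_libs_3.0.0_image_01
    environment:
      DISABLE_SSL: "true"
    ports:
      - "4040:4040"    # Spark UI
      - "18080:18080"  # Spark history server
    stdin_open: true   # keep STDIN open for the REPL
    tty: true          # allocate a pseudo-TTY for the REPL
    command: pyspark
    volumes:
      - ${PWD}/local_path_to_workspace:/home/glue_user/workspace/
```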
Finally, you can call it like this:
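The original command did not survive; one invocation that works once `stdin_open` and `tty` are set is to start the service in the background and attach to the REPL:

```sh
docker-compose up -d            # start the container detached
docker attach glue_container    # attach to the running pyspark REPL
# detach again without stopping the container: Ctrl-p Ctrl-q
```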
The same applies when you use Jupyter inside the container.
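For the Jupyter case, a sketch based on the AWS Glue 3.0 container documentation (the start-script path and the 8888 port come from those docs and may differ across image versions):

```yaml
# Inside the service definition: expose Jupyter and swap the command.
ports:
  - "8888:8888"    # Jupyter
  - "4040:4040"    # Spark UI
command: /home/glue_user/jupyter/jupyter_start.sh
```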