Run a command in the background from a Python subprocess and capture the logs
I am running a command in the background using nohup from Python's subprocess module with the command below, but I am not able to redirect the logs to a file other than nohup.out.
When the command below is run from the shell it works fine and redirects the logs, but when it is run from the Python subprocess the log files are not created:
nohup /usr/bin/spark-submit --class com.gdl.app.TestSparkStreamingSleep --master yarn --deploy-mode client --conf spark.eventLog.enabled=true gdl-spark-spd-1.0.jar sandbox > logs/sparkdetailed_SPD_TestSparkStreamingSleep_20220318_115937.log.out 2> logs/sparkdetailed_SPD_TestSparkStreamingSleep_20220318_115937.log.err < /dev/null &
Python subprocess code:
cmd = "nohup /usr/bin/spark-submit --class com.gdl.app.TestSparkStreamingSleep --master yarn --deploy-mode client --conf spark.eventLog.enabled=true gdl-spark-spd-1.0.jar sandbox > logs/sparkdetailed_SPD_TestSparkStreamingSleep_20220318_115937.log.out 2> logs/sparkdetailed_SPD_TestSparkStreamingSleep_20220318_115937.log.err < /dev/null &"
res = subprocess.Popen(cmd, stdout=None, stderr=None, close_fds=True, preexec_fn=os.setsid)
pid = os.getpgid(res.pid)
logger.info("Process ID : {}".format(pid))
Note: I have tried the following:
- converting the command into a list (see the note after this list), like
['nohup', '/usr/bin/spark-submit', '--class', 'com.gdl.app.TestSparkStreamingSleep', '--master', 'yarn', '--deploy-mode', 'client', '--conf', 'spark.eventLog.enabled=true', 'gdl-spark-spd-1.0.jar', 'sandme', '>', 'logs/sparkdetailed_SPD_TestSparkStreamingSleep_20220318_111813.log', '2>&1', '&']
- I have tried
nohup cmd > mylog.log 2>&1 &
- I have also tried passing
stdout=subprocess.PIPE, stderr=subprocess.PIPE
to subprocess; I know it won't help, but I tried this one as well.
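As a note on the list attempt above: when Popen receives a list, tokens like '>', '2>&1' and '&' are handed to spark-submit as ordinary arguments rather than interpreted as redirections; shell syntax is only honoured when the command is a single string run with shell=True. A minimal sketch of that variant, reusing the cmd string from earlier:

import subprocess

# Sketch: let /bin/sh interpret the redirections, '< /dev/null' and the
# trailing '&' embedded in the command string.
res = subprocess.Popen(cmd, shell=True, close_fds=True)

With shell=True the trailing '&' makes the shell return immediately, matching the interactive behaviour; note also that the relative logs/ path is resolved against the current working directory of the Python process, not the shell it was started from.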