Piping output of subprocess.Popen to files
I need to launch a number of long-running processes with subprocess.Popen, and would like to have the stdout and stderr from each automatically piped to separate log files. Each process will run simultaneously for several minutes, and I want two log files (stdout and stderr) per process to be written to as the processes run.

Do I need to continually call p.communicate() on each process in a loop in order to update each log file, or is there some way to invoke the original Popen command so that stdout and stderr are automatically streamed to open file handles?
4 Answers
You can pass stdout and stderr as parameters to Popen(). For example:
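The code sample that followed this answer is not preserved in this copy; below is a minimal sketch of the idea, with a hypothetical command name and illustrative log file names:

```python
import subprocess

# Open one file per stream; the OS writes the process's output into
# them directly as it runs, so no communicate() polling is needed.
with open("out.log", "wb") as out, open("err.log", "wb") as err:
    p = subprocess.Popen(["some_long_running_cmd"], stdout=out, stderr=err)
    p.wait()
```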
Per the docs, stdout and stderr accept (among other values) an existing file object. So just pass the open-for-writing file objects as the named arguments stdout= and stderr= and you should be fine!
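Applied to the question's scenario of several simultaneous processes, a sketch along these lines should work (the command list and file names are illustrative):

```python
import subprocess

# Hypothetical commands; substitute the real long-running programs.
commands = [["./worker", "a"], ["./worker", "b"], ["./worker", "c"]]

procs = []
for i, cmd in enumerate(commands):
    out = open(f"proc{i}.stdout.log", "wb")
    err = open(f"proc{i}.stderr.log", "wb")
    # Each process streams into its own pair of log files as it runs.
    procs.append((subprocess.Popen(cmd, stdout=out, stderr=err), out, err))

# Wait for all of them, then close the log files.
for p, out, err in procs:
    p.wait()
    out.close()
    err.close()
```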
I am simultaneously running two subprocesses, and saving the output from both into a single log file. I have also built in a timeout to handle hung subprocesses. When the output gets too big, the timeout always triggers, and none of the stdout from either subprocess gets saved to the log file. The answer posed by Alex above does not solve it.
Following up on the answer of Alex Martelli, I've created a small example which works for me. runit.sh is the program which is executed:
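The original snippets are not preserved on this page; a minimal reconstruction of what such an example might look like, assuming runit.sh is an executable script that writes to both stdout and stderr while it runs:

```python
import subprocess

# Launch runit.sh (assumed to exist and be executable), sending each
# stream to its own log file while the script runs.
with open("runit.out", "wb") as out, open("runit.err", "wb") as err:
    proc = subprocess.Popen(["./runit.sh"], stdout=out, stderr=err)
    print("runit.sh exited with code", proc.wait())
```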