Logging from a Python web app through a pipe? (performance concerns)
I'm writing a web app using Python with web.py, and I want to implement my own logging system. I'd like to log detailed information about each request that comes to Python (static files are handled by the web server).
Currently I'm thinking about writing the logs to a pipe. On the other side, there should be cronolog.
My main concern is performance: how does the time/resource cost of piping the logs compare to the normal processing of a request (fewer than 5 database queries, plus page generation from templates)?
Or are there better approaches? I don't want to write the log file from Python directly, because tens of processes will be started by FastCGI.
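A minimal sketch of the pipe idea, assuming a POSIX system: the app attaches a `logging.StreamHandler` to the stdin of a log-writing child process. Here `cat` redirected to a temporary file stands in for cronolog so the example runs anywhere; in production the command would instead be something like `["cronolog", "/var/log/app/%Y-%m-%d/access.log"]` (path and pattern are illustrative).

```python
import logging
import os
import subprocess
import tempfile

# `cat` is a stand-in for cronolog so this sketch is runnable anywhere.
log_path = os.path.join(tempfile.mkdtemp(), "access.log")
out = open(log_path, "w")
child = subprocess.Popen(["cat"], stdin=subprocess.PIPE, stdout=out, text=True)

# Route the app's access log through the pipe to the child process.
handler = logging.StreamHandler(child.stdin)
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))

logger = logging.getLogger("access")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("GET / 200 3 queries 12ms")

# Close the write end so the child sees EOF and exits cleanly.
child.stdin.close()
child.wait()
out.close()
```

Each worker process would open its own pipe to the rotator at startup; the write itself is a single buffered `write()` into a kernel pipe buffer, which is cheap next to even one database query.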
1 Answer
Pipes are one of the fastest I/O mechanisms available. It's just a shared buffer. Nothing more. If the receiving end of your pipe is totally overwhelmed, you may have an issue. But you have no evidence of that right now.
If you have tens of processes started by FastCGI, each can have its own independent log file. That's the ideal situation: use Python logging and give each process a unique log file.
In the rare event that you need to examine all log files, cat them together for analysis.