Asynchronous logging strategies for Python and PHP

Published 2024-12-11 00:23:27


Here's the situation: We have a bunch of python scripts continuously doing stuff and ultimately writing data in mysql, and we need a log to analyse the error rate and script performance.

We also have a php front-end that interacts with the mysql data, and we also need to log the user actions so that we can analyse their behaviour and compute some scoring functions.
So we thought of having a mysql table for each case (one for the "python scripts" log and one for the "user actions" log).

Ideally, we would be writing to these log tables asynchronously, for performance and low-latency reasons. Is there a way to do so in Python (we are using the django ORM) and in PHP (we are using the Yii Framework)?

Are there any better approaches for solving this problem?

Update: for the user actions (Web UI), we are now considering loading the Apache log, with the relevant session info, into mysql automatically through simple Apache configuration.
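If the Apache route is taken, each access-log line still has to be split into columns before it can be inserted into mysql. A minimal Python sketch of that step, assuming a combined log format extended with a trailing session cookie (the `%{PHPSESSID}C` field and the cookie name are assumptions, not part of the question):

```python
import re

# Matches a combined-format line plus a trailing quoted session cookie, e.g.
# produced by something like:
#   LogFormat "%h %l %u %t \"%r\" %>s %b \"%{PHPSESSID}C\"" session_log
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<bytes>\S+) "(?P<session>[^"]*)"'
)

def parse(line):
    """Turn one access-log line into a dict ready for a parameterised INSERT."""
    m = LINE.match(line)
    return m.groupdict() if m else None
```

A loader script would run `parse()` over each new line and batch the resulting dicts into a single multi-row INSERT.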


2 Answers

貪欢 2024-12-18 00:23:27


There are (AFAIK) only two ways to do anything asynchronously in PHP:

  • Fork the process (requires pcntl_fork)
  • exec() a process and release it by (assuming *nix) appending > /dev/null & to the end of the command string.

Both of these approaches result in a new process being created, albeit temporarily, so whether this would afford any performance increase is debatable and depends highly on your server environment - I suspect it would make things worse, not better. If your database is very heavily loaded (and therefore the thing that is slowing you down) you might get a faster result from dumping the log messages to file, and having a daemon script that crawls for things to enter into the DB - but again, whether this would help is debatable.
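The dump-to-file-plus-daemon idea can be sketched in Python; the JSON-lines format and the function names here are assumptions for illustration, not part of the original answer:

```python
import json
import time

def log_event(path, event_type, payload):
    # Append one JSON line per event. Short appends of a single line are
    # effectively atomic on POSIX, so concurrent scripts can share the file.
    with open(path, "a") as f:
        f.write(json.dumps({"ts": time.time(), "type": event_type, "data": payload}) + "\n")

def drain(path):
    # Daemon side: collect everything written so far into one batch for the DB.
    with open(path) as f:
        return [json.loads(line) for line in f if line.strip()]
```

The daemon would call `drain()` on an interval, insert the batch with one multi-row INSERT, then rotate or truncate the file - so the request path never waits on mysql.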

Python supports multi-threading which makes life a lot easier.

梦毁影碎の 2024-12-18 00:23:27


You could open a raw Unix or network socket to a logging service that caches messages and writes them to disk or database asynchronously. If your PHP and Python processes are long-running and generate many messages per execution, keeping an open socket would be more performant than making separate HTTP/database requests synchronously.

You'd have to measure it against appending to a file (open once, then lock, seek, write, and unlock for each message, closing at the end) to see which is faster.
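The open-socket approach can be sketched in Python. Here `socket.socketpair()` stands in for a long-lived AF_UNIX connection between the PHP/Python process (client end) and the caching log service (service end) - an assumption for the sketch; in production the service would listen on a named socket path:

```python
import json
import socket

# In-process stand-in for a persistent connection to the logging service.
client, service = socket.socketpair()

def send_log(sock, source, message):
    # One short send per event over the already-open socket; the service
    # buffers these and batches the database writes on its own schedule.
    sock.sendall((json.dumps({"src": source, "msg": message}) + "\n").encode())

send_log(client, "php-frontend", "user 42 clicked export")
send_log(client, "py-script", "nightly sync finished")

# What the service end reads off the socket and queues for mysql:
buffered = service.recv(4096).decode().splitlines()
```

Because the connection is opened once per process, each log call costs a single `send()` rather than a fresh HTTP or database round-trip, which is where the latency win comes from for long-running, chatty processes.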
