Log4Net and logging from parallel instances

Posted on 2024-08-13 00:20:07


I'm using log4net in my project and there is one problem.
The main function of the program takes a long time to run, and I use logging to save information about its progress. I use a FileAppender to write the log to a file.

My application sits in a shared (local) folder, and several instances of the application may be running from that one path. In this case only the first instance can write to the log; the other instances cannot, because the log file is locked.

When I use the "log4net.Appender.FileAppender+MinimalLock" option, information is sometimes lost: not all log entries from both instances are saved to the file.

How can I solve this problem and log from parallel instances? Also, what about the performance degradation when I use the "MinimalLock" option?

Thanks. I hope you can help.
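
(For reference, the MinimalLock option mentioned above is the lockingModel set on the appender; a minimal sketch, with a placeholder file name and layout:)

<appender name="LogFileAppender" type="log4net.Appender.FileAppender">
  <file value="app.log" />  <!-- placeholder path -->
  <appendToFile value="true" />
  <lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
  </layout>
</appender>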

Comments (6)

怀里藏娇 2024-08-20 00:20:07


Simply include the process id of the application in the log file name. Different instances of your app will then log to different files. Here is an example:

<appender name="MyRollingFileAppender" type="log4net.Appender.RollingFileAppender">
  <file type="log4net.Util.PatternString">
    <conversionPattern value="log_%processid.log" />
  </file>
  <!-- ... -->
</appender>
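
For completeness, the appender also has to be referenced from the root logger element; the level here is just an example value:

<root>
  <level value="INFO" />
  <appender-ref ref="MyRollingFileAppender" />
</root>
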
幸福%小乖 2024-08-20 00:20:07


I think this is a typical situation where a centralized logging solution is desirable. Instead of worrying about files and suffering from performance bottlenecks, I'd rather pump log statements asynchronously to some remote service that takes care of storing and handling the logs. Have a look at the log aggregator called logFaces; it was designed with the purpose of decoupling applications from managing their logs. It should work with the standard log4net UDP appender and will partition your log data per application, host, thread, etc., while letting you create log files whenever they are really needed.

Disclosure: I am the author of this product.
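
As a rough sketch, sending logs to a UDP-based aggregator like this means adding a UdpAppender; the address, port, and layout below are assumptions rather than logFaces-specific values:

<appender name="UdpAppender" type="log4net.Appender.UdpAppender">
  <remoteAddress value="192.168.1.10" />  <!-- hypothetical aggregator host -->
  <remotePort value="55200" />            <!-- hypothetical aggregator port -->
  <layout type="log4net.Layout.XmlLayoutSchemaLog4j" />
</appender>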

北座城市 2024-08-20 00:20:07


You can create a custom appender that opens the file to write and then closes it. Should it hit a locked file, it could pause and retry a small number of times.

In the custom appender you could also open the file in shared write mode, which allows multiple writers, but this won't prevent fragments of log lines from different processes being interleaved.

If you aren't writing a lot of data, the open/close mechanism listed above is probably your best option. Note that, because of the constant opening and closing of the file, you could see a noticeable performance impact if you are logging a lot of data.

A more complicated mechanism, but one that could provide a high-performance logging path: write a logging service that receives log lines via TCP or UDP. The service would be responsible for buffering the data and writing it to disk. We've used this approach in the past (not with Log4Net, but as a general solution) to improve log-writing efficiency.

呆橘 2024-08-20 00:20:07


Perhaps you could log to a different file from each instance? Otherwise, you'd probably need to set up a separate process dedicated to logging. Each instance of your program would send its log messages there, and that process would take care of appending them to the file. This could perhaps be accomplished using SocketAppender. Also, I find that breaking up the log output into chunks using RollingFileAppender is much easier to deal with.
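
To illustrate the RollingFileAppender suggestion, a size-based rolling setup might look something like this; the file name, size limit, and layout are placeholder values:

<appender name="RollingLogFileAppender" type="log4net.Appender.RollingFileAppender">
  <file value="app.log" />  <!-- placeholder path -->
  <appendToFile value="true" />
  <rollingStyle value="Size" />
  <maximumFileSize value="10MB" />
  <maxSizeRollBackups value="5" />
  <staticLogFileName value="true" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
  </layout>
</appender>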

哭了丶谁疼 2024-08-20 00:20:07


Definitely consider creating different log files for each process, perhaps with unique filenames generated using a timestamp.
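
As a sketch of that idea, log4net's PatternString can embed the startup timestamp (and, if you like, the process id) directly in the file name; the exact pattern below is just an example:

<appender name="TimestampFileAppender" type="log4net.Appender.FileAppender">
  <file type="log4net.Util.PatternString" value="log_%date{yyyyMMdd_HHmmss}_%processid.log" />
  <appendToFile value="true" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
  </layout>
</appender>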

黒涩兲箜 2024-08-20 00:20:07


Using InterProcessLock instead of MinimalLock can reduce data loss when multiple processes access a single log file:

log4net.Appender.FileAppender+InterProcessLock
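
In configuration terms this just means changing the lockingModel on the appender; a minimal sketch with a placeholder file name:

<appender name="SharedFileAppender" type="log4net.Appender.FileAppender">
  <file value="app.log" />  <!-- placeholder path -->
  <appendToFile value="true" />
  <lockingModel type="log4net.Appender.FileAppender+InterProcessLock" />
</appender>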