Tool for reducing large log files
I work with huge log files (1GB or so) that contain many user sessions, while I only care about one session.
I can usually narrow down the general area of the file that covers the session I am interested in just by searching for the session ID (which takes 2+ minutes). After that, I want to remove the data before and after the events of that session, so that my subsequent searches are faster (since I have now narrowed down the area of interest).
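To make that concrete, here is roughly the manual workflow I have in mind, sketched with standard tools (the session ID and line numbers below are made up):

    # Find the first and last lines mentioning the session (hypothetical ID).
    grep -n 'session-id=abc123' huge.log | head -n 1
    grep -n 'session-id=abc123' huge.log | tail -n 1

    # Cut the file down to just that region (line numbers are made up).
    sed -n '1200000,1350000p' huge.log > session-slice.log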
I like to load huge log files in Google Chrome and use the search highlight feature, which marks the areas of interest on the scrollbar, but it doesn't really work on files bigger than about 200MB, and it doesn't let me delete irrelevant parts of the log to make subsequent searches faster.
I imagine this is a common problem. It would be a huge time-saver if I could find such a tool.
Thanks.
2 Answers
Many Unix command-line tools help with this kind of task. In particular, grep finds lines containing a given string or pattern (such as the session ID). By default it prints only the matching lines, but it can also print n lines of context before or after each match, as shown below.
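For example (the session ID here is a placeholder), -C n prints n lines of context around each match, while -B and -A set the before/after context separately; redirecting the output gives a much smaller file for subsequent searches:

    # Keep every line containing the session ID, plus 500 lines of
    # context before and after each match.
    grep -C 500 'session-id=abc123' huge.log > session.log

    # Later searches then run against the much smaller file.
    grep 'some other term' session.log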
Splunk is a nice tool for log monitoring and analysis. It perhaps covers a bit more ground than you need, but it's definitely worth taking a look at. There's a free license available, which is limited to 500MB/day, and an enterprise license if you want to go all out (license comparison table).