Search a log file and print entries from a specific date onward

Published 2024-12-06 20:40:55

I'm working with a log file, and I want to print everything from a specific day to the end of the file. That specific date is, for example, $sd=27/Dec/2002. I want to search for this day and print from it to the end of the log file. But what if 27/Dec does not appear in the log file? In that case it should match entries >= $sd (27/Dec). How can I do this?

The code below only searches for $sd exactly (27/Dec/2002); I want it to match entries >= $sd (note: the original had the filename mistyped as serverlog.log.log and serveerlog.log):

sed -n "$(awk '/'$sd'/ {print NR}' serverlog.log | head -1),$ p" serverlog.log | cut -d: -f1

Example log file:

213.64.237.213 - - [23/Dec/2002:03:02:22 +0100]
213.132.36.66 - - [28/Dec/2002:19:33:29 +0100]

And the log file is sorted!



1 Comment

盗心人 2024-12-13 20:40:55

This is easy with awk. See the example below:

kent$  cat log.txt
213.64.237.213 - - [20/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [20/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [20/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [20/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [20/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [23/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [23/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [23/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [25/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [25/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [25/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [25/Dec/2002:03:02:22 +0100]
213.132.36.66 - - [28/Dec/2002:19:33:29 +0100]

kent$  sd=21/Dec/2002

kent$  awk -F'[:[]' -v d=$sd '$2>d' log.txt

Output:

213.64.237.213 - - [23/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [23/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [23/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [25/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [25/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [25/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [25/Dec/2002:03:02:22 +0100]
213.132.36.66 - - [28/Dec/2002:19:33:29 +0100]
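A note on why this works (my addition, not part of the original answer): the field separator `[:[]` splits each line at `[` and `:`, so `$2` is the `dd/Mon/yyyy` date field, and `$2>d` is a plain string comparison. That ordering is only correct while the month and year are fixed; across months (e.g. `02/Jan` vs `28/Dec`) it breaks, which is what the update below addresses. A minimal sketch, with an assumed file name `sample.log`:

```shell
# Build a tiny sample (assumed name sample.log), then filter it.
printf '%s\n' \
  '213.64.237.213 - - [20/Dec/2002:03:02:22 +0100]' \
  '213.132.36.66 - - [28/Dec/2002:19:33:29 +0100]' > sample.log

sd=21/Dec/2002
# -F'[:[]' makes $2 the date part; the comparison is lexicographic.
awk -F'[:[]' -v d="$sd" '$2 > d' sample.log
# Prints only the 28/Dec/2002 line.
```

Use `$2 >= d` instead of `$2 > d` if entries dated exactly `$sd` should also be kept, which is what the question asks for.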

Update

Try this awk line ($sd is the variable); I hope it works for you:

kent$  awk -F'[:[]' -v vd=$sd 'BEGIN{ gsub(/\//," ",vd);"date +%s -d \""vd"\""|getline d} {p=$0;  gsub(/\//," ",$2); "date +%s -d \""$2"\""|getline o;if(o>d) print p}' log.txt
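The one-liner above is dense, so here is the same logic laid out readably (an editorial sketch, not the answerer's formatting). It assumes GNU `date` and a file name of my choosing, `access.log`; each `dd/Mon/yyyy` value is converted to epoch seconds, so the comparison is numeric and survives month and year boundaries. I also use `>=` rather than `>` to match the question's "items >= $sd" requirement:

```shell
# Build a small test log (assumed name access.log).
cat > access.log <<'EOF'
213.64.237.213 - - [20/Dec/2002:03:02:22 +0100]
213.64.237.213 - - [23/Dec/2002:03:02:22 +0100]
213.132.36.66 - - [28/Dec/2002:19:33:29 +0100]
EOF

sd=21/Dec/2002
awk -F'[:[]' -v vd="$sd" '
BEGIN {
    gsub(/\//, " ", vd)                # "21/Dec/2002" -> "21 Dec 2002"
    cmd = "date +%s -d \"" vd "\""
    cmd | getline d; close(cmd)        # threshold as epoch seconds
}
{
    line = $0
    gsub(/\//, " ", $2)                # date field in a form date(1) accepts
    cmd = "date +%s -d \"" $2 "\""
    cmd | getline o; close(cmd)        # note: one date(1) call per line
    if (o >= d) print line             # >= keeps entries dated exactly $sd
}' access.log
```

One `date` process is spawned per log line, so this is slow on large logs; for a sorted file, the simpler string comparison above is much faster when the range stays within one month and year.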
