Multiple greps in a pipeline do not terminate after completing
I seem to be having a problem with a simple grep statement not finishing/terminating after it has completed.
For example:
grep -v -E 'syslogd [0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}: restart' |
grep -v 'printStats: total reads from cache:' /var/log/customlog.log > /tmp/filtered_log.tmp
The above statement will strip out the unwanted lines and save the result into a temp file; however, after grep finishes processing the entire file, the shell script hangs and cannot proceed any further. This behavior is also triggered when running the command manually on the command line. Essentially, combining multiple grep statements causes a pager-like action (as with more/less).
Does anyone have any suggestions to overcome this limitation? Ideally I wouldn't want to do the following, given that the customlog.log file might get huge at times.
cat /var/log/customlog.log |
grep -v -E 'syslogd [0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}: restart' |
grep -v 'printStats: total reads from cache:' > /tmp/filtered_log.tmp
Thanks,
Tony
1 Answer
As explained above, you need to move your file name onto the first grep:
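A sketch of that fix, reconstructed from the command in the question (the log file is now read by the first grep, so nothing in the pipe waits on the terminal's stdin):

# the first grep reads the file; the second reads the pipe
grep -v -E 'syslogd [0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}: restart' /var/log/customlog.log |
    grep -v 'printStats: total reads from cache:' > /tmp/filtered_log.tmp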
But you can also combine the two greps:
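For example, a sketch that merges both patterns into a single -E alternation (patterns taken verbatim from the question):

# one grep, one pass over the file; -v drops any line matching either pattern
grep -v -E 'syslogd [0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}: restart|printStats: total reads from cache:' \
    /var/log/customlog.log > /tmp/filtered_log.tmp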
Saves a bit of CPU and will fix your error at the same time.
BTW, another possible issue: what if two instances of this script are run at the same time? Both will be using the same temp file. This probably isn't an issue in this particular case, but you might as well get used to developing scripts with that situation in mind. I recommend that you use $$ to put the process ID in your temporary file name, as sketched below. Now, if two different people are running this process, you won't get them using the same temp file.
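A sketch of that idea, reusing the combined grep from above; the filtered_log.$$.tmp name is just illustrative ($$ expands to the current shell's process ID, so concurrent runs get different files):

# each invocation writes to its own PID-stamped temp file
tmpfile=/tmp/filtered_log.$$.tmp
grep -v -E 'syslogd [0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}: restart|printStats: total reads from cache:' \
    /var/log/customlog.log > "$tmpfile"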
Addendum
As pointed out by uwe-kleine-konig, you're actually better off using mktemp:
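A minimal sketch of the mktemp variant (mktemp creates a unique temporary file and prints its path; the tmpfile variable name is illustrative):

# mktemp guarantees a unique, freshly created temp file
tmpfile=$(mktemp) || exit 1
grep -v -E 'syslogd [0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}: restart|printStats: total reads from cache:' \
    /var/log/customlog.log > "$tmpfile"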
Thanks for the suggestion.