Writing a shell wrapper script for awk

Posted 2024-08-17 07:48:14

I want to embed an awk script inside a shell script, but I'm having trouble doing so because I don't know where a statement has to end with a ; and where it doesn't.

Here's my script

#!/bin/sh

awk='

BEGIN {FS = ",?+" }

# removes all backspaces preceded by any char except _
function format() {
    gsub("[^_]\b", "")
}

function getOptions() {
    getline
    format() 

    print
}

{
    format()

    if ($0 ~ /^SYNOPSIS$/ {
        getOptions()
        next            
    }

    if ($0  /^[ \t]+--?[A-Za-z0-9]+/) {
        print $0
    }
}

END { print "\n" }'

path='/usr/share/man/man1'
list=$(ls $path)

for item in $list
do
    echo "Command: $item"
    zcat $path$item | nroff -man | awk "$awk"
done > opts

I'm using nawk by the way.

Thanks in advance
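
For reference, awk treats a newline as the end of a statement, so a semicolon is only needed when two statements share one line. A minimal, self-contained sketch of that rule using the same embed-in-a-shell-variable pattern (the variable name prog and the sample input fed in via printf are illustrative, not taken from the script above):

#!/bin/sh

prog='
# a newline ends each awk statement; the semicolon below only separates
# two statements that sit on the same line
/^[ \t]+--?[A-Za-z0-9]+/ { count++; print }
END { printf "matched %d option lines\n", count }
'

printf '   -a  list all entries\n   --help  show help\nplain text\n' | awk "$prog"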

Comments (3)

半暖夏伤 2024-08-24 07:48:14

There are several things wrong, as far as I can see:

  1. You don't close the multi-line string being assigned to $awk. You need a single quote on the line after END { ... } (see the sketch after this list).
  2. You don't seem to actually use $awk anywhere. Perhaps you meant to use it in the invocation of awk inside the do loop.
  3. Once you fix those issues, awk is usually fairly forgiving about semicolons, but any problems in that regard have nothing to do with using it inside a shell script.
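
A stripped-down sketch of the quoting pattern point 1 describes: the lone single quote on its own line after the END block is what closes the assignment, and the one extra wrinkle is that a literal single quote inside the program has to be written as '\''. The awk body here is deliberately trivial, and nawk is assumed to be on the PATH as in the question:

awk='
BEGIN { printf "%s\n", "it'\''s quoted" }
{ print NF }
END { printf "done\n" }
'

printf 'a b c\n' | nawk "$awk"
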
素食主义者 2024-08-24 07:48:14

These three lines:

path='/usr/share/man/man1'
list=$(ls $path)
for item in $list

need to be changed to:

path='/usr/share/man/man1'
for item in $path/*

in case there are spaces in the filenames, and because ls is not intended to be used this way.
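
Folded back into the rest of the original loop, that change might look like the sketch below. The double quotes and the ${item##*/} expansion (which strips the directory prefix for the echo) are additions here, not part of the answer, and $awk is assumed to hold the awk program from the question:

path='/usr/share/man/man1'

for item in "$path"/*
do
    echo "Command: ${item##*/}"
    zcat "$item" | nroff -man | awk "$awk"
done > opts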

故笙诉离歌 2024-08-24 07:48:14

I'm not really sure what you meant, but if I understand you correctly, your showOpts.awk is the awk code at the beginning of your script, so you could do this:

path='/usr/share/man/man1'
list=$(ls $path)

for item in $list
do
    echo "Command: $item"
    zcat $path$item | nroff -man | nawk ' BEGIN {FS = ",?+" }
# removes all backspaces preceded by any char except _
function format() {
    gsub("[^_]\b", "")
}

function getOptions() {
    getline
    format()

    print
}

{
    format()

    if ($0 ~ /^SYNOPSIS$/) {
        getOptions()
        next
    }

    if ($0 ~ /^[ \t]+--?[A-Za-z0-9]+/) {
        print $0
    }
}

END { print "\n" } '
done >> opts

and you should probably use >> instead of >.
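
If the intent was really to keep the program in its own file (the showOpts.awk mentioned above) rather than pasting it inline, awk's -f option reads the program from a file instead. A minimal sketch, assuming the awk code has been saved as showOpts.awk and using one hypothetical man page as input:

# hypothetical single page; showOpts.awk holds the awk program shown above
path='/usr/share/man/man1'
zcat "$path/ls.1.gz" | nroff -man | nawk -f showOpts.awk >> opts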
