Shell script for Godaddy coupon codes - how does this script work?
On a coupon site someone posted a shell script for finding Godaddy discount codes.
1 - Could someone explain how this script works?
Specifically, I'm confused about the syntax:
links url -dump | grep AI
2 - Does shell scripting allow you to spider a site just as perl/python/ruby would?
3 - Is this the most efficient way to accomplish the desired goal, or would perl/python/ruby be a more effective technology to use for this task?
4 - Is this ethical/legal?
#!/bin/sh
# Try ISC promo codes of the form gdr0<600-699>a<a-z> and record any
# "SPECIAL OFFER" text found on the resulting pages.
gdaddy=600
while [ "$gdaddy" -lt 700 ]
do
    for i in a b c d e f g h i j k l m n o p q r s t u v w x y z
    do
        echo "The results for gdr0${gdaddy}a$i" >> output
        # Render the page as text and keep each "SPECIAL OFFER" line
        # plus the line that follows it (-A1).
        links "http://www.godaddy.com/default.aspx?isc=gdr0${gdaddy}a$i" -dump | grep -A1 "SPECIAL OFFER" >> output
        echo >> output
        echo >> output
    done
    gdaddy=`expr $gdaddy + 1`
done
Comments (4)
1. links is a text-based web browser. The -dump option makes links output the text of the web page to the terminal, and the following grep command outputs any line that contains the words "SPECIAL OFFER" plus the line after it (-A1 means "and 1 line After that"); a minimal example of this pipeline is sketched after this answer.
2. You can spider a site using shell scripting, by using links or similar to fetch the web pages and output their URLs. (I've done this, for a website spell-checker script.)
3. Use whatever tools you're happiest with. Personally I prefer Python for this kind of thing, but as I say, I've used shell scripting to do it.
4. Legal? Ask a lawyer. Ethical? Ask your conscience.
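To make point 1 concrete, here is a minimal sketch of that pipeline. It uses example.com and its visible text as stand-ins for the GoDaddy URL and the "SPECIAL OFFER" string, and assumes the links browser, wget, and GNU grep (for -A1 and -o) are installed.

#!/bin/sh
# Render a page as plain text with links, then print each matching line
# plus the one line after it (-A1). The URL and pattern are placeholders,
# not GoDaddy specifics.
links -dump "http://example.com/" | grep -A1 "Example Domain"

# A crude illustration of point 2: pull the outgoing links out of the raw
# HTML so a script could follow them (simple spidering).
wget -qO- "http://example.com/" | grep -o 'href="[^"]*"'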
Legal and ethical - these pages are not referenced in robots.txt.
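One quick way to check a claim like that is to look at the site's robots.txt directly. This is only a sketch, assuming curl is available; its output reflects whatever the site serves at the time, not anything asserted here.

#!/bin/sh
# Fetch the site's robots.txt and list the disallowed paths, so you can
# see whether pages like /default.aspx are referenced there.
curl -s "http://www.godaddy.com/robots.txt" | grep -i "^Disallow"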
1. It dumps the content returned for each URL, where the last letter is substituted with a-z, and finds any line in it containing "SPECIAL OFFER", padding the output with blank lines.
2. Yes, with utilities such as links, wget, or telnet (a wget-based variant is sketched below).
3. Shell scripting is good enough for undemanding things such as this (traversing a small set of URLs).
4. That's up to the site's terms of service and your legislation.
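For comparison with the links call in the script, here is a sketch of the same check done with wget, one of the utilities mentioned above. The code gdr0600aa is simply the first value the original loop would generate; note that grepping raw HTML is not quite the same as grepping the rendered text dump that links produces.

#!/bin/sh
# Fetch the raw HTML for one candidate ISC code (the first one the loop
# above would try) and keep any "SPECIAL OFFER" line plus the line after it.
wget -qO- "http://www.godaddy.com/default.aspx?isc=gdr0600aa" \
    | grep -A1 "SPECIAL OFFER"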
Legality relates to where you live. Consult a legal professional.
Ethical - if you have to ask, it's not. =)