Batch downloading and renaming output files with wget 1.12 on CentOS 6
The command

wget -c --load-cookies cookies.txt http://www.example.com/file

works fine, and so does

wget -c --load-cookies cookies.txt http://www.example.com/file.mpg -O filename_to_save_as.mpg

When I use

wget -c --load-cookies cookies.txt -i /dir/inputfile.txt

to pass URLs from a text file, it works as expected. Is there any way to pass a URL from a text file and still rename the output file, as in the second example above? I have tried passing the -O option with an argument, but wget tells me "invalid URL http://site.com/file.mpg -O new_name.mpg: scheme missing". I have also tried escaping after the URL, quoting, and formatting such as

url = "http://foo.bar/file.mpg" -O new_name.mpg

Is there any way to use an input file and still change the output file name with wget? If not, would a shell script be more appropriate? If so, how should it be written?
I don't think that wget supports it, but it's possible to do with a small shell script. First, create an input file (inputfile.txt) in which each line contains a URL and a filename separated by a tab character. Then run a bash script (wget2.sh), passing the input file to it on the command line:
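A minimal sketch of what the script could look like, assuming the tab-separated format described above; the URLs and filenames in the comments are placeholders, not part of the original answer:

```shell
#!/bin/bash
# wget2.sh -- read tab-separated lines of the form "URL<TAB>filename"
# from the file given as $1, e.g.:
#
#   http://www.example.com/file1.mpg<TAB>first_name.mpg
#   http://www.example.com/file2.mpg<TAB>second_name.mpg
#
# and download each URL to the requested output name.
while IFS=$'\t' read -r url filename; do
    # -c resumes partial downloads; -O sets the output filename
    wget -c --load-cookies cookies.txt -O "$filename" "$url"
done < "$1"
```

Invoke it as ./wget2.sh inputfile.txt. Setting IFS to a tab makes read split only on the tab, so spaces inside filenames are preserved.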
A simpler solution is to write a shell script which contains a wget command for every file:
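For example, a sketch with placeholder URLs and output names:

```shell
#!/bin/bash
# one explicit wget invocation per file, each with its own output name
wget -c --load-cookies cookies.txt -O first_name.mpg http://www.example.com/file1.mpg
wget -c --load-cookies cookies.txt -O second_name.mpg http://www.example.com/file2.mpg
```

This avoids the input-file parsing entirely, at the cost of editing the script whenever the list of files changes.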