Batch or VB script to copy logs from multiple directories into one directory, renaming to avoid conflicts
Hope you can help. I have been trying to resolve this for a week but am not getting anywhere and can't quite piece together what I need! My scripting skills are far from great, so please forgive my naivety!
OK, the problem...
I have an IIS server with multiple sites that all save their logs in separate directories. I need to copy the logs from the last 24 hours to a local directory on my machine so I can analyse them in Log Parser Lizard (GUI version) on a daily basis.
I can map a drive from the remote server to my local machine via a hardware VPN, which makes things a bit easier. Using the forfiles command I can recurse the directories to find the logs that are only a day old, and with copy, xcopy, or Robocopy I can set up a command to copy them. My problem is that the IIS logs all have the same name, so my copy command just keeps overwriting the previous file rather than creating a new one. I have tried using the %random% parameter for the file name, but this again creates one random file that is overwritten by the next file, keeping the same name instead of creating lots of randomly named files in one directory.
I know that Log Parser commands include recursion, which I have used successfully; however, that changes the log format slightly and the GUI Lizard cannot read the data within, so this is not a solution.
My code as it stands is shown below, with IPs changed for obvious reasons. Any help would be greatly appreciated!
@echo off
NET USE Q: /Delete /yes
NET USE Q: \\255.255.255.255\D$\Logs
cd C:
RD /S /Q C:\Weblogs\Production
MD C:\Weblogs\Production
forfiles.exe /p Q:\ /s /m *.log /d 0 /c "cmd /c robocopy /S /XC /XN /XO @file C:\Weblogs\Production\%random%.log"
NET USE Q: /delete
exit
Comments (1)
%RANDOM% does not work for you in this case because it is not resolved on each iteration, but only once, when the forfiles command is invoked. You'll need to use some unique identifier inside FORFILES; concatenating @RELPATH and @FNAME may work for you, as long as your recursion is only one level deep.
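One way that could look, as a sketch only: the helper name copyone.bat, its C:\Scripts location, and the destination folder are assumptions, and it keeps the /d 0 filter from the question as-is. Because FORFILES quotes its substitution values, in-line string editing is awkward, so the values are handed to a small helper batch file:

forfiles /p Q:\ /s /m *.log /d 0 /c "cmd /c C:\Scripts\copyone.bat @path @relpath"

copyone.bat then flattens the relative path into a unique destination name:

@echo off
rem %1 = full file path (quoted), %2 = relative path (quoted); both supplied by FORFILES
setlocal EnableDelayedExpansion
set "rel=%~2"
rem Turn .\SiteFolder\u_ex140101.log into SiteFolder_u_ex140101.log
set "rel=!rel:.\=!"
set "rel=!rel:\=_!"
copy %1 "C:\Weblogs\Production\!rel!"
endlocal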
Alternatively, replace FORFILES with a FOR loop. Inside the loop you have more freedom to calculate a unique ID; a simple counter might work for you.
Edit: see this simple code sample to get you started.
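A minimal sketch of that approach, assuming the Q: mapping and destination folder from the question (note it copies every .log it finds; the last-24-hours filter that forfiles /d provided is not reproduced here):

@echo off
setlocal EnableDelayedExpansion
set /a count=0
rem Walk the mapped drive recursively; %%F holds the full path of each .log file
for /r Q:\ %%F in (*.log) do (
    set /a count+=1
    rem Prefix a running counter so identically named site logs cannot collide
    copy "%%F" "C:\Weblogs\Production\!count!_%%~nxF"
)
endlocal

Because the counter changes inside the loop body, the delayed-expansion form !count! is what keeps each name unique; %count% would be expanded just once when the block is parsed, exactly like %random% in the original script.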