Foreach Loop Container with the Foreach File Enumerator option iterates all files twice
I am using the SSIS Foreach Loop Container to iterate through files matching a certain pattern on a network share.

I am encountering a kind of unreproducible malfunction of the Loop Container: sometimes the loop is executed twice. After all files have been processed, it starts over with the first file.

Has anyone encountered a similar bug? Maybe not directly in SSIS, but while accessing files on a Windows share with some other technology? Could this error be related to some network issue?

Thanks.
Comments (3)
I found this to be the case whilst working with Excel files and using the *.xlsx wildcard to drive the foreach.

Once I put logging in place, I noticed that when the Excel file was opened it produced a temporary file prefixed with ~$. This was picked up by the foreach loop.

So I used a trick similar to http://geekswithblogs.net/Compudicted/archive/2012/01/11/the-ssis-expression-wayndashskipping-an-unwanted-file.aspx to exclude files with ~$ in the filename.
What error messages (SSIS log / Event Viewer messages) do you get?

Similar to @Siva, I've not come across this, but here are some ideas you could use to try to diagnose it. You may be doing some of these already; I've just written them down for completeness from my thought process...
Nothing helped, so I implemented the following workaround: a Script Task in the foreach iterator which tracks all files. If a file was already loaded, a warning is fired and the file is not processed again. Anyway, it seems to be some network-related problem...
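A minimal C# sketch of such a tracking Script Task, under the following assumptions (all variable names are hypothetical, not taken from the original post): User::FileName is the String variable mapped by the Foreach enumerator, User::ProcessedFiles is an Object variable used to hold the tracking set, and User::SkipFile is a Boolean that downstream tasks check via a precedence-constraint expression:

    // Runs once per Foreach iteration; remembers every file name seen in
    // this execution and flags repeats so they are skipped, not reloaded.
    public void Main()
    {
        string fileName = (string)Dts.Variables["User::FileName"].Value;

        // Lazily create the tracking set inside the Object variable on first use.
        var processed = Dts.Variables["User::ProcessedFiles"].Value
            as System.Collections.Generic.HashSet<string>;
        if (processed == null)
        {
            processed = new System.Collections.Generic.HashSet<string>(
                System.StringComparer.OrdinalIgnoreCase);
            Dts.Variables["User::ProcessedFiles"].Value = processed;
        }

        // HashSet.Add returns false if the file name was already tracked.
        if (!processed.Add(fileName))
        {
            Dts.Events.FireWarning(0, "TrackFiles",
                "File already processed, skipping: " + fileName, string.Empty, 0);
            Dts.Variables["User::SkipFile"].Value = true;
        }
        else
        {
            Dts.Variables["User::SkipFile"].Value = false;
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }

For this to run, the three variables would have to be listed in the Script Task's ReadOnlyVariables/ReadWriteVariables, and the data flow gated on an expression like @[User::SkipFile] == false.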