Does a long-running workflow in SharePoint block the w3wp process?
We have a WSS 3.0 installation with Search Server, which is used to search for documents and save the search definition so the search can be repeated later. The users want the option to download all the files in their search results as a one-off Zip file.
I have a very basic solution where the Zipping of the files is done in the web part when the user clicks on the button, but if the zip file takes a while to create the user is left waiting (and I suspect, any other users accessing the site will be waiting because I imagine the compression of the documents is being done by the w3wp process).
I thought perhaps I could kick off the zip file creation as a workflow instead, and the user be allowed to download the file once the workflow is complete, but I've now realised that workflows run under the w3wp process too.
If a workflow task takes a long time to execute (if, for example, the user has picked a large number of documents to download), will it impact other users of the SharePoint site and stop them accessing any pages until the workflow has completed?
Obviously we are going to place some limitation on the maximum size of the documents the user can zip up to download so that we don't kill the machine, but I'm still worried that whatever limit we place, the workflow process could still end up locking out other users.
Is this the case?
Are there any better suggestions for creating such a task which would not affect other users?
Thanks
3 answers
Put a delay activity in the workflow before the activity that does the ZIP creation. This will push the workflow from the interactive W3WP process to the WSSTimerV3 service since it needs to run at a future time.
Regards,
Paul
http://blogs.msdn.com/pandrew
Even when you were doing the zip in the web part, you were not blocking other users: the w3wp worker thread that processed the request was blocked waiting for the zip to complete, but all the other worker threads were able to continue.
Still, this could become a scalability issue if there were many waiting-for-zip threads: eventually, incoming requests might block waiting for worker threads to become available. That's one reason to use asynchronous processing in ASP.NET.
Using a workflow would help, because the workflow is kicked off and the request completes, permitting other requests to be processed.
You were concerned about the workflow running in w3wp. However, I don't know that it runs on one of the worker threads within w3wp; I don't know how SharePoint configures its workflow host, but I suspect it uses a different set of threads.
You might want to do some load testing of this to find out. Create a dummy web part that just runs a zip as soon as you request the page containing it. Run up a load to that page and find out how many requests you can get before they begin to queue up waiting for worker threads to become available. Then do the same thing, but have the web part kick off the workflow. Again, see how many requests you can run at the same time before requests start to queue up.
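The hand-off described above — kick the work off, let the request complete, and let the user come back for the result — can be sketched outside SharePoint. The following Python sketch is purely illustrative (the `start_zip_job` function and the in-memory `jobs` registry are invented for this example; a real web part would persist job state somewhere a later request can see it). It zips files on a background thread so the calling thread returns immediately:

```python
import io
import threading
import uuid
import zipfile

# Hypothetical in-memory job registry; a real implementation would
# persist job state (e.g. in a database or document library) so a
# later request can find the finished zip.
jobs = {}

def start_zip_job(files):
    """Start zipping (name, bytes) pairs on a background thread and
    return a job id immediately, freeing the request thread."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"done": False, "data": None}

    def work():
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
            for name, content in files:
                zf.writestr(name, content)
        # Replace the entry atomically once the archive is complete.
        jobs[job_id] = {"done": True, "data": buf.getvalue()}

    threading.Thread(target=work, daemon=True).start()
    return job_id
```

A follow-up request (or AJAX poll) would check `jobs[job_id]["done"]` and serve the bytes once the flag flips, which is the same shape as letting the user download the file after the workflow finishes.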
If the zipping takes less than 5 seconds I would just do it synchronously in the same thread and be done with it. Least complexity, best user experience, no blocking of other users (limited by the ASP.NET thread pool size). A bunch of clicks will kill the server though.
If you have large files or lots of traffic you could persist a First In First Out queue in a database and have a Windows service take them out and execute them. This way you have control over the thread count used to zip files. This solution gives you O(1) algorithmic complexity but greatly increases the complexity of the design. You may want to consider using something like AJAX to tell the user "You are 4 of 45 in the queue...".
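The database-backed FIFO can be sketched in a few lines. In this Python sketch, SQLite stands in for whatever database the Windows service would poll, and the table and column names are purely illustrative. The service's worker loop would call `take_next` on however many threads you dedicate to zipping, and `position_in_queue` supports the "You are 4 of 45" message:

```python
import sqlite3

def make_queue(conn):
    # Jobs are ordered by their auto-incrementing id: first in, first out.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS zip_jobs ("
        " id INTEGER PRIMARY KEY AUTOINCREMENT,"
        " payload TEXT NOT NULL,"
        " status TEXT NOT NULL DEFAULT 'queued')"
    )

def enqueue(conn, payload):
    """Called from the web request: record the job and return at once."""
    cur = conn.execute("INSERT INTO zip_jobs (payload) VALUES (?)", (payload,))
    conn.commit()
    return cur.lastrowid

def take_next(conn):
    """What the service's worker loop would do: claim the oldest queued job."""
    row = conn.execute(
        "SELECT id, payload FROM zip_jobs"
        " WHERE status = 'queued' ORDER BY id LIMIT 1"
    ).fetchone()
    if row is None:
        return None
    conn.execute("UPDATE zip_jobs SET status = 'working' WHERE id = ?", (row[0],))
    conn.commit()
    return row

def position_in_queue(conn, job_id):
    """Backs the 'You are 4 of 45 in the queue...' progress message."""
    ahead = conn.execute(
        "SELECT COUNT(*) FROM zip_jobs WHERE status = 'queued' AND id <= ?",
        (job_id,),
    ).fetchone()[0]
    total = conn.execute(
        "SELECT COUNT(*) FROM zip_jobs WHERE status = 'queued'"
    ).fetchone()[0]
    return ahead, total
```

Because the worker count lives entirely in the service, you control exactly how many zips run concurrently, independent of the ASP.NET thread pool.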
If file sizes vary widely, you may want to implement the first two solutions as strategies and add a third, adaptive strategy that defers to one of them based on factors such as unzipped file size and server resource availability. A good compromise between user experience and resource availability, but the most complex (and expensive).
Groete,
Hans