Minimum per-item processing time for Parallel.ForEach

Posted 2024-12-07 04:18:34

Suppose I have a list of items that are currently processed in a normal foreach loop. Assume the number of items is significantly larger than the number of cores. As a rule of thumb, how much time should each item take before I should consider refactoring the foreach loop into a Parallel.ForEach?
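(Illustration added for context, not part of the original question: a minimal before/after sketch of the refactor being asked about. items and ProcessItem are hypothetical placeholders for the real list and per-item work.)

using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // Hypothetical work items; ProcessItem stands in for whatever
        // per-item work the real loop does.
        List<int> items = Enumerable.Range(0, 10_000).ToList();

        // Sequential version: the loop being considered for refactoring.
        foreach (var item in items)
        {
            ProcessItem(item);
        }

        // Parallel version: the same work, handed to the thread pool.
        // Only valid if ProcessItem does not touch shared mutable state.
        Parallel.ForEach(items, item => ProcessItem(item));
    }

    static void ProcessItem(int item)
    {
        // Placeholder for the real per-item work.
    }
}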


Comments (2)

给不了的爱 2024-12-14 04:18:34

This is one of the core problems of parallel programming. For an accurate answer you would still have to measure in your exact situation.

The big advantage of the TPL, however, is that the threshold is a lot smaller than it used to be, and that you're not punished (as much) when your work items are too small.

I once made a demo with two nested loops, intending to show that only the outer one should be made to run in parallel. But the demo failed to show a significant disadvantage of turning both into a Parallel.For().

So if the code in your loop is independent, go for it.

The #items / #cores ratio is not very relevant; the TPL will partition the ranges and use the 'right' number of threads.
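(Illustrative sketch only, not from the original answer: one way to do the "measure in your exact situation" step with Stopwatch. DoWork is a hypothetical stand-in; you would tune its inner loop count to mimic the real per-item cost and see where the parallel version starts paying off on your machine.)

using System;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;

class Benchmark
{
    static void Main()
    {
        int[] items = Enumerable.Range(0, 100_000).ToArray();

        // Sequential baseline.
        var sw = Stopwatch.StartNew();
        foreach (var item in items)
        {
            DoWork(item);
        }
        sw.Stop();
        Console.WriteLine($"foreach:          {sw.ElapsedMilliseconds} ms");

        // Parallel version of the same loop.
        sw.Restart();
        Parallel.ForEach(items, item => DoWork(item));
        sw.Stop();
        Console.WriteLine($"Parallel.ForEach: {sw.ElapsedMilliseconds} ms");
    }

    // Simulated per-item cost; adjust the inner loop count to model
    // cheaper or more expensive work items.
    static void DoWork(int item)
    {
        double x = item;
        for (int i = 0; i < 1_000; i++)
        {
            x = Math.Sqrt(x + i);
        }
    }
}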

奢望 2024-12-14 04:18:34

On a large data-processing project I'm working on, any loop that contained more than two or three statements benefited greatly from Parallel.ForEach. If the data your loop is working on is atomic, then I see very little downside compared to the tremendous benefit the Parallel library offers.
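(Illustrative sketch only, not from the original answer: what "atomic"/independent work means in practice. If iterations update shared state, a plain Parallel.ForEach introduces a data race; the thread-local overload below is one safe way to aggregate. items and the summing work are hypothetical.)

using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class Aggregation
{
    static void Main()
    {
        int[] items = Enumerable.Range(1, 1_000).ToArray();

        // NOT safe in parallel: "sum += item" on a shared variable is a race.
        // long sum = 0;
        // Parallel.ForEach(items, item => sum += item);

        // Safe pattern: per-thread partial sums, merged with Interlocked.
        long total = 0;
        Parallel.ForEach(
            items,
            () => 0L,                                   // per-thread initial state
            (item, state, local) => local + item,       // accumulate locally, no sharing
            local => Interlocked.Add(ref total, local)  // merge once per thread
        );

        Console.WriteLine(total); // 500500
    }
}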
