Concurrent queue - general questions (description and usage)

Posted on 2024-08-07 22:47:12

I am having some trouble grasping the idea of a concurrent queue. I understand a queue is a FIFO, or first come, first served, data structure.

Now, when we add the concurrency part, which I interpret as thread safety (please let me know if that is incorrect), things get a bit fuzzy. By concurrency, do we mean the way various threads can add to the queue, or delete (service an item) from the queue? Does concurrency provide a sense of ordering to these operations?

I would greatly appreciate a general description of the functionality of a concurrent queue. A similar post here is not as general as I hoped.

Also, is there such a thing as a concurrent priority queue? What would be its usage?

Many thanks in advance, for any brief explanations or helpful links on this subject.

Comments (4)

混浊又暗下来 2024-08-14 22:47:12

The notion that a BlockingQueue carries little overhead is a bit misleading. Acquiring a lock incurs pretty substantial overhead. The context switching alone accounts for thousands of instructions. Not just that, but the progress of one thread will directly affect another thread. It's not as bad as it was years ago, but compared to non-blocking, it is substantial.

BlockingQueues use locks for mutual exclusion.

ArrayBlockingQueue, LinkedBlockingQueue, and PriorityBlockingQueue are three blocking queues, while

ConcurrentLinkedQueue and Java 1.7's LinkedTransferQueue use the Michael and Scott non-blocking queue algorithm.

Under moderate to low contention (which is more of a real-world scenario), the non-blocking queues significantly outperform blocking queues.

And to note on Steve's comment about the lack of bottlenecks: under heavy contention a non-blocking algorithm can bottleneck on the constant CAS attempts, while blocking will suspend the threads. We then see that a BlockingQueue under heavy contention slightly outperforms a non-blocking queue, but that type of contention isn't the norm by any means.
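To make the contrast concrete, here is a minimal Java sketch showing a lock-based, bounded ArrayBlockingQueue next to the CAS-based, unbounded ConcurrentLinkedQueue. The class and string names are purely illustrative.

```java
import java.util.Queue;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentLinkedQueue;

public class QueueContrast {
    public static void main(String[] args) throws InterruptedException {
        // Lock-based and bounded: put() blocks when full, take() blocks when empty.
        BlockingQueue<String> blocking = new ArrayBlockingQueue<>(2);
        blocking.put("job-1");
        System.out.println("blocking take: " + blocking.take());

        // CAS-based (Michael & Scott) and unbounded: offer()/poll() never block;
        // poll() simply returns null when the queue is empty.
        Queue<String> nonBlocking = new ConcurrentLinkedQueue<>();
        nonBlocking.offer("job-2");
        System.out.println("non-blocking poll: " + nonBlocking.poll());
    }
}
```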

静待花开 2024-08-14 22:47:12

I understand by "concurrency" that the queue is thread-safe. This does not mean that it will be efficient. However, I would imagine that the Java queues use a lock-free implementation, which means that there is little or no penalty when two threads attempt a push or a pop at the same time. What generally happens is that they use atomic locking at the assembler level, which ensures that the same object cannot be popped twice.

I once wrote a lock-free FIFO queue (in Delphi) which worked very well. Much more efficient than a previous version which used critical sections. The CS version ground to a halt, especially with many threads all trying to access the queue. The lock-free version, however, had no bottlenecks despite many threads accessing it a lot.
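For readers curious what such a lock-free FIFO can look like, below is a rough Java sketch in the spirit of the Michael and Scott algorithm referenced above. It is a simplified illustration (not the Delphi implementation described here, and not production-hardened): every update is a single CAS on a node's next pointer, the head, or the tail.

```java
import java.util.concurrent.atomic.AtomicReference;

// Simplified Michael & Scott style lock-free FIFO queue.
public class LockFreeQueue<E> {
    private static final class Node<E> {
        final E item;
        final AtomicReference<Node<E>> next = new AtomicReference<>(null);
        Node(E item) { this.item = item; }
    }

    private final AtomicReference<Node<E>> head;
    private final AtomicReference<Node<E>> tail;

    public LockFreeQueue() {
        Node<E> dummy = new Node<>(null);          // sentinel node
        head = new AtomicReference<>(dummy);
        tail = new AtomicReference<>(dummy);
    }

    public void enqueue(E item) {
        Node<E> node = new Node<>(item);
        while (true) {
            Node<E> last = tail.get();
            Node<E> next = last.next.get();
            if (last == tail.get()) {              // snapshot still consistent?
                if (next == null) {
                    // Link the new node after the current last node.
                    if (last.next.compareAndSet(null, node)) {
                        // Swing tail forward; if this CAS fails, another thread already did it.
                        tail.compareAndSet(last, node);
                        return;
                    }
                } else {
                    // Tail is lagging; help advance it, then retry.
                    tail.compareAndSet(last, next);
                }
            }
        }
    }

    public E dequeue() {
        while (true) {
            Node<E> first = head.get();
            Node<E> last = tail.get();
            Node<E> next = first.next.get();
            if (first == head.get()) {
                if (first == last) {
                    if (next == null) {
                        return null;                // queue is empty
                    }
                    tail.compareAndSet(last, next); // help a lagging tail
                } else {
                    E item = next.item;
                    if (head.compareAndSet(first, next)) {
                        return item;                // next node becomes the new sentinel
                    }
                }
            }
        }
    }
}
```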

述情 2024-08-14 22:47:12

You should start by checking out the BlockingQueue interface definition, as this is the cornerstone of using queues for communication between threads and contains utility methods that allow producer and consumer threads to access the queue in either a blocking or non-blocking fashion. This, along with thread-safe access, is my understanding of what constitutes a "concurrent queue" (although I've never heard that phrase before - BlockingQueue simply lives in the java.util.concurrent package).
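As a quick sketch of those two access styles on a single BlockingQueue (the capacity of 1 and the string values are just for illustration):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class BlockingVsNonBlockingAccess {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(1);

        // Blocking style: put() and take() wait until the operation can complete.
        queue.put("first");

        // Non-blocking style: offer() and poll() return immediately,
        // optionally waiting up to a timeout.
        boolean accepted = queue.offer("second");                          // false: queue is full
        boolean acceptedWithWait = queue.offer("second", 100, TimeUnit.MILLISECONDS);

        System.out.println(accepted + " / " + acceptedWithWait);           // false / false
        System.out.println(queue.take());                                  // first
    }
}
```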

To answer the second part of your question, the priority queue implementation you should study is PriorityBlockingQueue. This may be useful if your producer thread(s) are producing tasks of varying priorities (e.g. requests from "normal users" and "power users") and you wish to control the order in which tasks are processed by your consumer thread(s). One possible pitfall to avoid is the starvation of low-priority tasks that are never removed from the queue due to the constant influx of higher-priority tasks.
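A small sketch of that idea. The Task class and its priority scheme are hypothetical, purely for illustration:

```java
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityTasks {
    // Hypothetical task type for illustration: lower number = higher priority.
    static final class Task {
        final String name;
        final int priority;
        Task(String name, int priority) { this.name = name; this.priority = priority; }
    }

    public static void main(String[] args) throws InterruptedException {
        PriorityBlockingQueue<Task> queue =
                new PriorityBlockingQueue<>(16, Comparator.comparingInt((Task t) -> t.priority));

        queue.put(new Task("normal-user request", 5));
        queue.put(new Task("power-user request", 1));

        // take() blocks until an element is available and always hands out
        // the highest-priority task first, regardless of insertion order.
        System.out.println(queue.take().name);   // power-user request
        System.out.println(queue.take().name);   // normal-user request
    }
}
```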

掩饰不了的爱 2024-08-14 22:47:12

Just leaving a link here to the java.util.concurrent package, which I think contains very important information about some of the questions raised here.

See: Concurrent Collections and Memory Consistency Properties
