HashTable Concurrency
I have a HashTable which is accessed by multiple threads. For example, let's look at three threads:
Thread A does Hash.Insert("a", new object());
Thread B does Hash.Insert("b", new object());
Thread C does Hash.Insert("a", new object());
For some reason, I cannot use a lock on the entire hash.
I don't care about the order, or which object will be in the hash at the end of the process. The only thing I care about is not getting data corruption by updating the same cell from different threads.
What are my options? Or is this not a problem, and the HashTable handles it by itself and keeps the data intact?
You could consider using something like ConcurrentDictionary from the System.Collections.Concurrent namespace.
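For the exact scenario in the question, a minimal sketch (assuming .NET 4.0 or later; the use of `Task.Run` and the indexer in place of `Hash.Insert` are my choices, not the asker's code) might look like:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // ConcurrentDictionary replaces the non-thread-safe HashTable.
        var hash = new ConcurrentDictionary<string, object>();

        // Three concurrent writers, two of them racing on key "a".
        // The indexer set is thread-safe: one of the racing writes
        // simply wins, and the internal state is never corrupted.
        Task.WaitAll(
            Task.Run(() => hash["a"] = new object()),
            Task.Run(() => hash["b"] = new object()),
            Task.Run(() => hash["a"] = new object()));

        // Both keys end up present; which object sits under "a" is
        // whichever write happened last, as the question allows.
        Console.WriteLine(hash.Count); // 2
    }
}
```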
ConcurrentDictionary should work for you. It is not lock-free, but it does not "lock the entire hash" except in certain situations.
It uses two collections: a lock array and a collection of hash buckets.
The number of lock buckets can be controlled by setting the concurrency level, and the initial number of hash buckets can be controlled by setting the initial capacity. (These are both constructor parameters.)
Each bucket in the lock array covers several (well, at least one) hash buckets using a simple modulo hash.
The concurrent dictionary only locks all of the lock buckets during a few whole-table operations, such as resizing.
With the exception of resizing, these are all easily avoidable.
Resizing can be avoided if you can predict the maximum number of items in the dictionary.
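A sketch of what sizing the dictionary up front looks like, using the `(concurrencyLevel, capacity)` constructor overload mentioned above (the particular values here are illustrative assumptions, not a recommendation):

```csharp
using System;
using System.Collections.Concurrent;

class Sizing
{
    static void Main()
    {
        // concurrencyLevel controls the number of lock buckets;
        // a common starting point is the number of cores.
        int concurrencyLevel = Environment.ProcessorCount;

        // capacity controls the initial number of hash buckets.
        // If this covers the maximum number of items you expect,
        // the full-table lock taken during a resize never happens.
        int initialCapacity = 10000;

        var hash = new ConcurrentDictionary<string, object>(
            concurrencyLevel, initialCapacity);

        hash.TryAdd("a", new object());
        Console.WriteLine(hash.Count); // 1
    }
}
```

The trade-off is memory: a larger initial capacity allocates more buckets up front, in exchange for never pausing all writers to rehash.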