Do collections that prevent inserting duplicate elements work more slowly?
Do collections that prevent inserting duplicate elements work more slowly than non-checking ones? I assume they run some kind of duplication check against the existing elements on each insertion.
Or is that incorrect, or is the overhead tolerable in most cases?
Thanks
2 Answers
It depends on the implementation of course, but most sets are likely to be optimised in some form to check for containment quickly. For example, HashSet&lt;T&gt; is basically a hash table of values - so it's just a hash lookup. I don't know of any collections which would check every existing element for equality (unless you have a horrible hash collision situation etc).
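To illustrate the difference this answer describes, here is a minimal C# sketch (my own, not from the answer): HashSet&lt;T&gt;.Add rejects a duplicate via a hash lookup, while a hand-rolled list-backed "set" has to scan every existing element to achieve the same guarantee.

```csharp
using System;
using System.Collections.Generic;

class ContainmentDemo
{
    static void Main()
    {
        // HashSet<T>.Add hashes the element and probes one bucket -
        // roughly O(1) per insertion, regardless of how large the set is.
        var set = new HashSet<int>();
        bool added = set.Add(42);      // true: 42 was not present
        bool addedAgain = set.Add(42); // false: duplicate rejected via hash lookup

        // A naive list-backed "set" must call Contains, which walks the
        // list and compares every element - O(n) per insertion.
        var list = new List<int>();
        if (!list.Contains(42)) list.Add(42);
        if (!list.Contains(42)) list.Add(42); // second attempt is skipped

        Console.WriteLine($"added={added}, addedAgain={addedAgain}, list count={list.Count}");
    }
}
```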
That totally depends on the implementation of the collection you are using - if it is based on a List&lt;T&gt;, there will be a performance penalty.
However, if a HashSet&lt;T&gt; is used, the performance will be almost the same as an unchecked collection's.
Still, performance shouldn't be the motivation here. If you want to allow duplicate items, use a collection that does; otherwise use one that doesn't.
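As a rough sketch of why the implementation matters, the hypothetical micro-benchmark below compares unchecked List&lt;T&gt;.Add, a Contains-guarded List&lt;T&gt;, and HashSet&lt;T&gt;.Add. The element count is arbitrary and Stopwatch timings are only indicative; a serious comparison would use a proper benchmarking harness such as BenchmarkDotNet.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class InsertBenchmark
{
    const int N = 20_000; // arbitrary size, chosen only for illustration

    static void Main()
    {
        // Unchecked insertion: List<T>.Add never inspects existing elements.
        var sw = Stopwatch.StartNew();
        var plainList = new List<int>();
        for (int i = 0; i < N; i++) plainList.Add(i);
        Console.WriteLine($"List.Add (no check):  {sw.ElapsedMilliseconds} ms");

        // List-based duplicate check: Contains scans the list on every
        // insertion, making the whole loop O(n^2).
        sw.Restart();
        var checkedList = new List<int>();
        for (int i = 0; i < N; i++)
            if (!checkedList.Contains(i)) checkedList.Add(i);
        Console.WriteLine($"List.Contains + Add:  {sw.ElapsedMilliseconds} ms");

        // Hash-based duplicate check: HashSet<T>.Add is a hash lookup,
        // so its cost stays close to the unchecked insertion.
        sw.Restart();
        var set = new HashSet<int>();
        for (int i = 0; i < N; i++) set.Add(i);
        Console.WriteLine($"HashSet.Add:          {sw.ElapsedMilliseconds} ms");
    }
}
```

On a typical run the Contains-guarded list is dramatically slower than the other two, while the HashSet stays in the same ballpark as the unchecked list - which is the point both answers make.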