C# List<>: should I lower the capacity when removing items?
I have a List container which can potentially have up to 100,000 items in it to start with. While the program is running this list will slowly empty; should I alter the capacity as I empty the list?
I have done some testing and execution time seems to be the same, but is there much overhead to lowering the capacity of a list? I can find lots of information about increasing the capacity but not much on lowering it.
3 Answers
Unless you have a very low amount of memory, this is a micro-optimization. Normally, there is no need to change the capacity of a List<>.
From the TrimExcess method documentation: the method can be used to minimize a collection's memory overhead if no new elements will be added to the collection, and it does nothing if the list is at more than 90 percent of capacity, because the gain would not be worth the cost of reallocating and copying the backing array.
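For illustration, here is a minimal sketch (not code from the question) of how TrimExcess might be used once the list has shrunk; the item counts simply mirror the question, and the comment about the 90 percent threshold reflects the documented behaviour:

    using System;
    using System.Collections.Generic;

    class TrimExcessExample
    {
        static void Main()
        {
            // Start with a large list, as in the question.
            var items = new List<int>(100000);
            for (int i = 0; i < 100000; i++) items.Add(i);

            // Simulate the program slowly emptying the list.
            items.RemoveRange(0, 95000);
            Console.WriteLine("Count = " + items.Count + ", Capacity = " + items.Capacity);

            // TrimExcess reallocates the backing array down towards Count,
            // but does nothing if the list is above ~90% of its capacity.
            items.TrimExcess();

            // Setting Capacity explicitly forces the reallocation regardless.
            // items.Capacity = items.Count;

            Console.WriteLine("Count = " + items.Count + ", Capacity = " + items.Capacity);
        }
    }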
Do the math: 100,000 items * 4 bytes per item = roughly 400KB. If that's too much memory overhead for your program, you can call TrimExcess, as Oded points out, or recreate smaller lists once it gets smaller. (I'm not sure that reducing the capacity will actually have the effect you're going for.)
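As a hedged sketch of the "recreate a smaller list" option: constructing a new List<T> from an existing list sizes the new backing array to the current element count, so the old oversized array becomes eligible for collection. The helper name below is made up for illustration:

    using System.Collections.Generic;

    static class ListShrinkHelper
    {
        // Hypothetical helper: returns a fresh list whose capacity matches the
        // remaining number of items, letting the old backing array be collected.
        public static List<T> Recreate<T>(List<T> source)
        {
            // The List<T>(IEnumerable<T>) constructor uses ICollection<T>.Count
            // to size the new backing array when the source is itself a List<T>.
            return new List<T>(source);
        }
    }

After removing most of the items you could then write items = ListShrinkHelper.Recreate(items); and drop the old list.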
Lowering the capacity of a list involves allocating a new backing array and copying the data across, so it's a relatively expensive operation.
In your particular case I would say it is not worth it unless you start hitting memory problems.
One strategy that can be employed, if it were to become a real problem, is to create a 'chunked' implementation of IList<> which uses not one array but several, each of a preconfigured size, with additional chunks (fixed-size arrays) added as the previous one fills up. This also allows the list to shrink relatively inexpensively by releasing unused chunks as items are removed, while minimizing the memory overhead to just one non-full chunk (the last). This approach adds a performance overhead to every operation on the list though, as the list has to calculate which chunk an item resides in and create new chunks as required. So it is not useful unless you truly have a memory problem and a list whose size truly changes dramatically over time.
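To make the "chunked" idea concrete, here is a minimal, hypothetical sketch rather than a full IList<> implementation: it only supports indexing, Add and removing from the end, and the chunk size, class name and member names are assumptions chosen for illustration:

    using System;
    using System.Collections.Generic;

    // Sketch of a chunked list: items live in fixed-size chunks so that
    // whole chunks can be released as the collection shrinks.
    class ChunkedList<T>
    {
        private const int ChunkSize = 4096;                     // assumed chunk size
        private readonly List<T[]> _chunks = new List<T[]>();
        private int _count;

        public int Count { get { return _count; } }

        public T this[int index]
        {
            get
            {
                if (index < 0 || index >= _count) throw new ArgumentOutOfRangeException("index");
                // Work out which chunk the item resides in, then the offset within it.
                return _chunks[index / ChunkSize][index % ChunkSize];
            }
            set
            {
                if (index < 0 || index >= _count) throw new ArgumentOutOfRangeException("index");
                _chunks[index / ChunkSize][index % ChunkSize] = value;
            }
        }

        public void Add(T item)
        {
            // Allocate a new chunk only when the previous one has filled up.
            if (_count == _chunks.Count * ChunkSize)
                _chunks.Add(new T[ChunkSize]);
            _chunks[_count / ChunkSize][_count % ChunkSize] = item;
            _count++;
        }

        public void RemoveLast()
        {
            if (_count == 0) throw new InvalidOperationException("List is empty.");
            _count--;
            _chunks[_count / ChunkSize][_count % ChunkSize] = default(T);
            // Release the last chunk once it is completely unused, so memory
            // shrinks as the list empties; at most one non-full chunk remains.
            if (_count <= (_chunks.Count - 1) * ChunkSize)
                _chunks.RemoveAt(_chunks.Count - 1);
        }
    }

The trade-off described above is visible in the indexer: every access pays for the divide/modulo to locate the chunk, which is the extra cost a plain List<T> does not have.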