How to fetch data from a large list with throttling enabled?
We have a SharePoint list containing 50,000 items, and want to fetch some data from it without disabling SP2010's default throttling.
From the MSDN article on handling large lists, we figured the key ingredient would be using a small RowLimit on the SPQuery, and using ListItemCollectionPosition for batching.
However, with our code looking something like this, the throttling exceptions are still triggered:
SPQuery query = new SPQuery();
query.Query = "<Where><Contains><FieldRef Name=\"MatterName\" /><Value Type=\"Text\">7476922</Value></Contains></Where>";
query.ViewFields = "<FieldRef Name=\"MatterName\"/>";
query.RowLimit = 10;

do
{
    // Fetch the next batch, then remember where it ended so the
    // next iteration resumes from that position.
    SPListItemCollection batch = mattersList.GetItems( query );
    query.ListItemCollectionPosition = batch.ListItemCollectionPosition;
}
while ( query.ListItemCollectionPosition != null );
According to the MVP experts at SharePoint Connections 2010, this is by design: the implicit sort on the result set still trips the 5,000-item throttling threshold.
Which is nice and all, but then how do we fetch from this list? Would using a ContentIterator be a better option? If so, what magic does ContentIterator pull off to make this happen?
5 Answers
You can use ContentIterator, which helps with accessing more than 5,000 items in a large list without hitting the list throttling limit and receiving an SPQueryThrottleException.
ContentIterator implements a callback pattern that segments the query and processes a single item at a time. Consider using this capability if you need to process a large number of items that may exceed the throttling limit.
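For what it's worth, here is a minimal sketch of that callback pattern, assuming SharePoint Server 2010 (ContentIterator lives in the Microsoft.Office.Server.Utilities namespace and is not part of Foundation). mattersList and the MatterName field are borrowed from the question; appending the ItemEnumerationOrderByNVPField ordering hint is what keeps each segment index-friendly:

using System;
using Microsoft.SharePoint;
using Microsoft.Office.Server.Utilities;

// Same Contains filter as the question, plus ContentIterator's
// recommended OrderBy so each segment stays under the threshold.
SPQuery query = new SPQuery();
query.Query = "<Where><Contains><FieldRef Name=\"MatterName\" /><Value Type=\"Text\">7476922</Value></Contains></Where>"
            + ContentIterator.ItemEnumerationOrderByNVPField;
query.RowLimit = 10;

ContentIterator iterator = new ContentIterator();
iterator.ProcessListItems(
    mattersList,
    query,
    item =>
    {
        // Called once per item; do the per-item work here.
        Console.WriteLine(item["MatterName"]);
    },
    (item, e) =>
    {
        // Returning true rethrows the exception; false skips the item.
        return false;
    });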
There is an exception to the ContentIterator solution: your list must be indexed (which is required), and if the index matches more than 5,000 rows (the threshold configured in Central Admin), you will still get a throttle exception even before the ContentIterator instance starts to go through the content.
You can use:
query.QueryThrottleMode = SPQueryThrottleOption.Override;
provided you execute the query as a super user.
http://adicodes.com/sharepoint-2010-list-throtelling/
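A minimal sketch of what executing "as a super user" usually boils down to: run the query under SPSecurity.RunWithElevatedPrivileges so it executes as the application pool account, with object model override allowed in the web application's resource throttling settings. The site URL and list name below are placeholders:

SPSecurity.RunWithElevatedPrivileges(delegate
{
    // Open a fresh SPSite inside the elevated scope so the query
    // actually runs under the application pool identity.
    using (SPSite site = new SPSite("http://server/site"))   // placeholder URL
    using (SPWeb web = site.OpenWeb())
    {
        SPList mattersList = web.Lists["Matters"];           // placeholder list name
        SPQuery query = new SPQuery();
        query.Query = "<Where><Contains><FieldRef Name=\"MatterName\" /><Value Type=\"Text\">7476922</Value></Contains></Where>";
        query.QueryThrottleMode = SPQueryThrottleOption.Override;
        query.RowLimit = 10;

        SPListItemCollection items = mattersList.GetItems(query);
    }
});

Note that even with Override, the query is still capped by the web application's higher object-model-override threshold (20,000 items by default), not unlimited.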
As an admin, not a developer, I don't have a code solution for you - but I do have two no-code "solutions" for you to consider.
Both of these settings are adjusted at the Web Application level.
Fields used inside your <Where> clause would need to be indexed. Setting indexed fields also needs to happen while the list is below the throttling threshold. For example, on a brand new list, setting which columns are indexed will succeed. However, once that list's item count goes above the throttling threshold, setting new indices will fail, since throttling applies to adding indices as well.
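If it helps, here is a minimal sketch of marking the column from the question's <Where> clause as indexed before the list crosses the threshold; the web and list references are placeholders:

// Mark MatterName as indexed while the list is still below the
// throttling threshold; afterwards this call would itself be throttled.
SPList mattersList = web.Lists["Matters"];   // placeholder list name
SPField matterName = mattersList.Fields.GetFieldByInternalName("MatterName");
if (!matterName.Indexed)
{
    matterName.Indexed = true;
    matterName.Update();
}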