LINQ-to-SQL performance problem with mass inserts
I have identified a problem within my application; basically, one sub-routine prepares (lots of) data that is later on inserted into my local database via a LINQ-to-SQL data context. However, even a relatively modest amount of new data (100,000-ish rows) takes a tremendous amount of time to be saved into the database when SubmitChanges()
is called. Most of the time, however, it is more likely that the application has to save around 200,000 to 300,000 rows.
According to SQL Server's profiler, all generated queries look like the one below, and there's one for each item the application inserts.
exec sp_executesql N'INSERT INTO [dbo].[AdjectivesExpanded]([Adjective], [Genus], [Casus], [SingularOrPlural], [Kind], [Form])
VALUES (@p0, @p1, @p2, @p3, @p4, @p5)
SELECT CONVERT(BigInt,SCOPE_IDENTITY()) AS [value]',N'@p0 bigint,@p1 char(1),@p2 tinyint,@p3 bit,@p4 tinyint,@p5 nvarchar(4000)',@p0=2777,@p1='n',@p2=4,@p3=0,@p4=3,@p5=N'neugeborener'
Does anyone have an idea how to increase the performance of mass inserts with LINQ-to-SQL data contexts, ideally without getting rid of the strongly-typed DataContext and falling back to hand-written queries per se? Plus, there's little opportunity or room to tune the underlying database. If anything at all, I could disable integrity constraints, if it helps.
3 Answers
Have a look at the following page for a simple walk-through of how to change your code to use a Bulk Insert.
You just need to add the (provided) BulkInsert class to your code, make a couple of changes, and you'll see a huge improvement in performance.
Mikes Knowledge Base - BulkInserts with LINQ
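As a rough illustration of the kind of change involved (the actual class and method names in the linked article may differ), the usage ends up looking something like this, with MyDataContext and BulkInsertAll as hypothetical names:

// Hypothetical usage sketch; BulkInsertAll is a stand-in for whatever the
// linked BulkInsert class actually exposes, MyDataContext for the real context.
using (var db = new MyDataContext())
{
    // Hands all rows to SqlBulkCopy in one operation instead of one INSERT per row.
    db.BulkInsertAll(newAdjectives);
}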
ORM is usually not a good idea for mass operations. I'd recommend an old-fashioned bulk insert to get the best performance.
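In .NET terms that usually means SqlBulkCopy. A minimal sketch, assuming the column layout from the profiler output above, a hypothetical itemsToInsert collection, and that the existing DataContext (db) is only borrowed for its connection string:

// Requires System.Data and System.Data.SqlClient.
// Column names/types follow the INSERT shown in the profiler; adjust as needed.
var table = new DataTable();
table.Columns.Add("Adjective", typeof(long));          // bigint
table.Columns.Add("Genus", typeof(string));            // char(1)
table.Columns.Add("Casus", typeof(byte));              // tinyint
table.Columns.Add("SingularOrPlural", typeof(bool));   // bit
table.Columns.Add("Kind", typeof(byte));               // tinyint
table.Columns.Add("Form", typeof(string));             // nvarchar

foreach (var item in itemsToInsert)
{
    table.Rows.Add(item.Adjective, item.Genus, item.Casus,
                   item.SingularOrPlural, item.Kind, item.Form);
}

// db is the existing LINQ-to-SQL DataContext; only its connection string is reused here.
using (var bulkCopy = new SqlBulkCopy(db.Connection.ConnectionString))
{
    bulkCopy.DestinationTableName = "dbo.AdjectivesExpanded";
    bulkCopy.WriteToServer(table);   // one bulk operation instead of 200,000+ single INSERTs
}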
Are you doing something like this:
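(A sketch of the first pattern, with hypothetical MyDataContext and Items names; SubmitChanges runs once per row:)

// Hypothetical names; note SubmitChanges inside the loop.
using (var db = new MyDataContext())
{
    foreach (var item in itemsToInsert)
    {
        db.Items.InsertOnSubmit(item);
        db.SubmitChanges();   // one change-tracking pass and one round trip per row
    }
}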
Or:
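(Same hypothetical names; all rows are queued first and SubmitChanges runs once:)

// Hypothetical names; SubmitChanges is called a single time after the loop.
using (var db = new MyDataContext())
{
    foreach (var item in itemsToInsert)
    {
        db.Items.InsertOnSubmit(item);
    }
    db.SubmitChanges();   // one change-tracking pass for the whole batch
}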
If it is similar to the first, I would recommend changing it to something like the second. Each call to SubmitChanges makes a pass over all the tracked objects to see what has changed.
Either way, I'm not convinced that inserting that volume of items is a good idea for LINQ-to-SQL, because it has to generate and evaluate the SQL each time.
Could you script a stored procedure and add it as a DataContext method through the designer?
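A rough sketch of that idea, assuming the stored procedure has been dragged onto the O/R designer so the DataContext exposes a strongly-typed wrapper (the method name and parameter list below are assumptions):

// Hypothetical designer-generated wrapper around a stored procedure.
using (var db = new MyDataContext())
{
    foreach (var a in itemsToInsert)
    {
        // Still one round trip per row, but no per-row SQL generation by LINQ-to-SQL.
        db.InsertAdjectiveExpanded(a.Adjective, a.Genus, a.Casus,
                                   a.SingularOrPlural, a.Kind, a.Form);
    }
}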