Inserting a large number of rows into SQL CE 4.0 with Entity Framework 4 (performance issue)

Posted 2024-11-29 05:54:29

I have a small SQL CE 4.0 database with several tables, mapped using Entity Framework 4.

Here is the code I have:

foreach (String designation in newItemDesignations)
{
    ParameterData defaultValue = PDM.GetDefaultParameterData(designation);

    // Fill the ItemParameterDBO object with the data
    ItemParameterDBO dbParam = new ItemParameterDBO();
    dbParam.ItemID = dbItem.ID;
    dbParam.Designation = designation;    
    dbParam.High = defaultValue.High;
    dbParam.Low = defaultValue.Low;

    database.ItemParameterDBOes.AddObject(dbParam);
}

database.SaveChanges();

This code runs 24 times, and each time the newItemDesignations list contains exactly 525 elements. That's a total of 12600 rows to add.

The complete process takes 509 seconds. I think that's too much for 12600 rows.

I am aware that I am calling SaveChanges 24 times. At the moment, the application design does not allow me to put all inserts into a single transaction (with one SaveChanges). Still, look at what happens within a single batch: 509 / 24 ≈ 21 seconds per batch of 525 rows, or about 40 ms per row.

  • Is 40 ms the normal (avg) time for a row to be inserted via EF4?

I've checked my other code (everything other than adding to the database and saving changes). It takes a total of 100 ms for all 12600 rows, which is roughly 0.02% of the complete time, so that's obviously not the problem. Virtually all of the processing time is spent in EF4's AddObject and SaveChanges.

I am also aware that I am setting the ItemID property, which is a foreign key. It's an integer, so I guess it shouldn't matter much, but I wouldn't know.

Also note: there are no indexes set on any of the tables (except for primary/foreign keys)

  • What am I doing wrong here, why is this so slow?
  • Is this the normal time needed for inserting that many rows, or is this some kind of performance restriction related to SQL CE 4?
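For comparison, a single-SaveChanges version of the loop would look like the sketch below. This is hypothetical (the question states the current design rules it out), and `allBatches` is an assumed collection holding the 24 designation lists; the other names come from the question's code. Note that even with one SaveChanges, EF4 still issues one INSERT per row against SQL CE, so this mostly saves per-batch overhead rather than per-row cost.

```csharp
// Hypothetical single-flush variant: accumulate all 24 batches in one
// ObjectContext and call SaveChanges once. "allBatches" is an assumed
// collection of the 24 designation lists; other names are from the question.
foreach (List<String> newItemDesignations in allBatches)
{
    foreach (String designation in newItemDesignations)
    {
        ParameterData defaultValue = PDM.GetDefaultParameterData(designation);

        ItemParameterDBO dbParam = new ItemParameterDBO();
        dbParam.ItemID = dbItem.ID;
        dbParam.Designation = designation;
        dbParam.High = defaultValue.High;
        dbParam.Low = defaultValue.Low;

        database.ItemParameterDBOes.AddObject(dbParam);
    }
}

// One change-tracker flush instead of 24.
database.SaveChanges();
```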

Comments (3)

南烟 2024-12-06 05:54:29

Since the examples are scarce, here is the code I tested, and it worked flawlessly. Thanks to ErikEJ's SqlCeBulkCopy library. A must-have.

DataTable table = new DataTable();

table.Columns.Add(new DataColumn("A", typeof(int)));
table.Columns.Add(new DataColumn("B", typeof(String)));
table.Columns.Add(new DataColumn("C", typeof(Byte)));

for (int i = 0; i < 12000; i++)
{
    DataRow row = table.NewRow();
    row["A"] = 124324;        // column "A" is typed int
    row["B"] = "something";
    row["C"] = 15;

    table.Rows.Add(row);
}

String connString = @"Data Source = C:\Database.sdf";
SqlCeBulkCopy bulkInsert = new SqlCeBulkCopy(connString);
bulkInsert.DestinationTableName = "Items";
bulkInsert.WriteToServer(table);

Inserting my (OP) 12600 rows took less than 2 seconds.

This example is not ideal since it is not strongly typed, but it shows how to manually create a DataTable and insert it into the database using the SqlCeBulkCopy library (see the accepted answer for the link).
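A strongly typed variant might build the table from the question's own objects. This is only a sketch: `BuildParameterTable` is a hypothetical helper, and the column names and the `double` type for High/Low are assumptions based on the question, not confirmed schema.

```csharp
// Hypothetical strongly-typed helper: builds the DataTable from the
// question's ItemParameterDBO objects instead of loose literals.
// Column names and the double type for High/Low are assumptions.
static DataTable BuildParameterTable(IEnumerable<ItemParameterDBO> parameters)
{
    DataTable table = new DataTable();
    table.Columns.Add("ItemID", typeof(int));
    table.Columns.Add("Designation", typeof(String));
    table.Columns.Add("High", typeof(double));
    table.Columns.Add("Low", typeof(double));

    foreach (ItemParameterDBO p in parameters)
    {
        // Rows.Add(params object[]) fills columns in declaration order.
        table.Rows.Add(p.ItemID, p.Designation, p.High, p.Low);
    }
    return table;
}
```

The resulting table can then be passed to SqlCeBulkCopy.WriteToServer as above.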

猫腻 2024-12-06 05:54:29

You could consider using my SqlCeBulkCopy library to bypass EF: http://sqlcebulkcopy.codeplex.com

无语# 2024-12-06 05:54:29

To build on ErikEJ's answer and your own sample, you can use an implementation of IDataReader over a list to stream the data into WriteToServer instead of duplicating the values through a DataTable. See this question:

Get an IDataReader from a typed List

I implemented this at work once; it didn't appear to improve performance, but it did seem to reduce memory consumption.
