SqlBulkCopy from a List<>

How can I do a big insert with SqlBulkCopy from a List<> of simple objects?

Do I need to implement a custom IDataReader?

5 Answers

花开雨落又逢春i 2024-10-03 22:27:12

With FastMember, you can do this without ever needing to go via DataTable (which, in my tests, more-than-doubles the performance):

using(var bcp = new SqlBulkCopy(connection))
using(var reader = ObjectReader.Create(data, "Id", "Name", "Description"))
{
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader);
}

Note that ObjectReader can also work with non-generic sources, and it is not necessary to specify the member-names in advance (although you probably want to use the ColumnMappings aspect of SqlBulkCopy if you don't specify them in the ObjectReader itself).
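
If you let ObjectReader pick up the members itself, a minimal sketch of wiring up the mappings explicitly might look like this (the table and column names are just the placeholders from the example above):

using(var bcp = new SqlBulkCopy(connection))
using(var reader = ObjectReader.Create(data)) // no member names: all public members are read
{
    bcp.DestinationTableName = "SomeTable";
    // map source members to destination columns explicitly
    bcp.ColumnMappings.Add("Id", "Id");
    bcp.ColumnMappings.Add("Name", "Name");
    bcp.ColumnMappings.Add("Description", "Description");
    bcp.WriteToServer(reader);
}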

阪姬 2024-10-03 22:27:12

Simply create a DataTable from your list of objects and call SqlBulkCopy.WriteToServer, passing the data table.

You might find the following useful:

For maximum performance with SqlBulkCopy, you should set an appropriate BatchSize. 10,000 seems to work well - but profile for your data.

You might also observe better results when using SqlBulkCopyOptions.TableLock.

An interesting and informative analysis of SqlBulkCopy performance can be found here.
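
A minimal sketch of this approach, assuming hypothetical Item objects and a hypothetical destination table dbo.Items:

var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Name", typeof(string));
foreach (var item in items)
{
    table.Rows.Add(item.Id, item.Name);
}

using (var connection = new SqlConnection(connStr))
{
    connection.Open();
    using (var bcp = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null))
    {
        bcp.DestinationTableName = "dbo.Items";
        bcp.BatchSize = 10000; // a reasonable starting point - profile for your data
        bcp.WriteToServer(table);
    }
}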

红颜悴 2024-10-03 22:27:12

Late to the party, but if you add this EntityDataReader class from Microsoft, there's an AsDataReader() extension method that does exactly that: https://github.com/matthewschrager/Repository/blob/master/Repository.EntityFramework/EntityDataReader.cs

(Example of using [List].AsDataReader():)

var connStr = ""; // your connection string
using (var connection = new SqlConnection(connStr)) 
{
    var startTime = DateTime.Now;
    connection.Open();
    var transaction = connection.BeginTransaction();
    try
    {
        // enlist the bulk copy in the transaction so a failure rolls everything back
        using (var sbCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
        {
            sbCopy.BulkCopyTimeout = 0; // no timeout
            sbCopy.BatchSize = 10000;
            sbCopy.DestinationTableName = "Foobars";
            var reader = Foobars.AsDataReader();
            sbCopy.WriteToServer(reader);
        }
        transaction.Commit();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        transaction.Rollback();
    }
    finally
    {
        transaction.Dispose();
        connection.Close();
        var endTime = DateTime.Now;
        Console.WriteLine("Upload time elapsed: {0} seconds", (endTime - startTime).TotalSeconds);
    }
}
煮酒 2024-10-03 22:27:12

Depending on what you are trying to accomplish by calling SqlBulkCopy in the first place, it might make more sense to use a Table-Valued Parameter (TVP). Using a TVP makes it trivial to send in a collection of any custom type. The data can be streamed in, so you can avoid the DataTable (much like in @Marc Gravell's answer) and you can avoid SqlBulkCopy as well. TVPs allow complete flexibility in how the data is handled once it reaches SQL Server: you call a stored procedure and pass the TVP data in, where it appears as a table variable that you can do anything with, not just INSERT (which is all SqlBulkCopy gives you). You can also get data back via a SqlDataReader, such as newly created IDENTITY values. I added an example and some additional notes on this answer: How can I insert 10 million records in the shortest time possible?. And several years ago I wrote an article on SQL Server Central (free registration required), Streaming Data Into SQL Server 2008 From an Application, which is also noted in that linked answer, providing a working example of passing in a generic List of a custom type, streamed in from a 3 million row text file.
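
A minimal sketch of the TVP approach, assuming a hypothetical user-defined table type dbo.ItemList with columns (Id INT, Name NVARCHAR(100)) and a hypothetical stored procedure dbo.ImportItems taking @Items dbo.ItemList READONLY already exist on the server:

using System.Data;
using Microsoft.SqlServer.Server; // SqlDataRecord, SqlMetaData

// Stream the list as SqlDataRecords - rows are yielded one at a time,
// so nothing is materialized up front.
static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<Item> items)
{
    var meta = new[]
    {
        new SqlMetaData("Id", SqlDbType.Int),
        new SqlMetaData("Name", SqlDbType.NVarChar, 100)
    };
    foreach (var item in items)
    {
        var record = new SqlDataRecord(meta);
        record.SetInt32(0, item.Id);
        record.SetString(1, item.Name);
        yield return record;
    }
}

// Pass the stream as a Structured parameter to the stored procedure.
using (var connection = new SqlConnection(connStr))
using (var command = new SqlCommand("dbo.ImportItems", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    var tvp = command.Parameters.Add("@Items", SqlDbType.Structured);
    tvp.TypeName = "dbo.ItemList";
    tvp.Value = ToRecords(items);
    connection.Open();
    command.ExecuteNonQuery();
}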

只是一片海 2024-10-03 22:27:12

Came across a similar situation trying to insert a couple million rows into the db.

Got this done by converting the List into a DataTable and then inserting the table into the database.

private static DataTable CreateDataTableItem(List<ListItem> ItemList)
{
    DataTable dt = new DataTable();
    try
    {
        dt.TableName = "PartPrice";

        // One column per property; unwrap Nullable<T> so the column gets the
        // underlying type, and allow DBNull for missing values.
        foreach (PropertyInfo property in typeof(ListItem).GetProperties())
        {
            dt.Columns.Add(new DataColumn() { ColumnName = property.Name, DataType = Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType, AllowDBNull = true });
        }

        foreach (var item in ItemList)
        {
            DataRow newRow = dt.NewRow();
            foreach (PropertyInfo property in typeof(ListItem).GetProperties())
            {
                // null property values become DBNull in the row
                newRow[property.Name] = item.GetType().GetProperty(property.Name)?.GetValue(item, null) ?? DBNull.Value;
            }
            dt.Rows.Add(newRow);
        }
        return dt;
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
        return null;
    }
}

public class ListItem
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int? NullableId { get; set; }

}

And then bulk insert using

private void BulkInsert(DataTable dt)
{
    string consString = _config.GetConnectionString("yourConnectionStringkey");
    using SqlConnection connection = new SqlConnection(consString);

    connection.Open();
    // UseInternalTransaction wraps each batch in its own transaction
    using var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.UseInternalTransaction, null);
    sqlBulkCopy.DestinationTableName = "dbo.TargetDb";
    sqlBulkCopy.ColumnMappings.Add("Id", "Id");
    sqlBulkCopy.ColumnMappings.Add("Name", "Name");
    sqlBulkCopy.ColumnMappings.Add("NullableId", "NullableId");

    sqlBulkCopy.WriteToServer(dt);
    connection.Close();
}

You don't have to do the column mappings, given the documentation on SqlBulkCopy.ColumnMappings:

    // Summary:
    //     Returns a collection of Microsoft.Data.SqlClient.SqlBulkCopyColumnMapping items.
    //     Column mappings define the relationships between columns in the data source and
    //     columns in the destination.
    //
    // Value:
    //     A collection of column mappings. By default, it is an empty collection.
    //
    // Remarks:
    //     If the data source and the destination table have the same number of columns,
    //     and the ordinal position of each source column within the data source matches
    //     the ordinal position of the corresponding destination column, the <xref:Microsoft.Data.SqlClient.SqlBulkCopy.ColumnMappings>
    //     collection is unnecessary. However, if the column counts differ, or the ordinal
    //     positions are not consistent, you must use <xref:Microsoft.Data.SqlClient.SqlBulkCopy.ColumnMappings>
    //     to make sure that data is copied into the correct columns. During the execution
    //     of a bulk copy operation, this collection can be accessed, but it cannot be changed.
    //     Any attempt to change it will throw an <xref:System.InvalidOperationException>.

These two lines let you insert nullable values into the DataTable columns:

 { ColumnName = property.Name, DataType = Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType, AllowDBNull = true }

newRow[property.Name] = item.GetType().GetProperty(property.Name)?.GetValue(item, null) ?? DBNull.Value;