Update query on millions of rows fills the transaction log


I need to update millions of rows as part of my next release, but doing so fills the transaction log and fails. I have a few ideas but I'm not a SQL expert so I'm sure there will be gotchas that I'm not aware of.

Pertinent points:

  1. I need to hand a script over to the operations team so need a T-SQL method with no manual intervention.
  2. Apparently the transaction log gets recycled every 15 minutes. (I've thought about writing a loop with a try-catch with WAITFOR DELAY '00:15:00' in the catch block like below)
  3. (EDIT) I can't modify anything except the data.
  4. (EDIT) It's a simple update changing a foreign key column to a different existing key.

Thanks,

Phil

DECLARE
    @AffectedRows int,
    @RowsToUpdate int   -- total number of rows that need to change; set this before the loop

SET @AffectedRows = 0

WHILE @AffectedRows < @RowsToUpdate
BEGIN
    BEGIN TRY
        BEGIN TRAN
        -- Do some updates (one batch per iteration; @@ROWCOUNT picks up the UPDATE's row count)
        SET @AffectedRows = @AffectedRows + @@ROWCOUNT
        COMMIT TRAN
    END TRY
    BEGIN CATCH
        -- e.g. error 9002 (transaction log full): roll back, wait for the log
        -- to be recycled, then let the loop retry this batch
        IF @@TRANCOUNT > 0 ROLLBACK TRAN
        PRINT ERROR_MESSAGE()
        WAITFOR DELAY '00:15:00'
    END CATCH
END

PRINT @AffectedRows
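
For illustration, the "-- Do some updates" placeholder would be something along these lines (the table, column and key values here are made up; it's just the kind of batched foreign-key change I described in point 4):

-- Hypothetical batch for the placeholder above: remap rows that still point
-- at the old key to the new, already existing key, a fixed-size batch at a
-- time so each transaction stays small.
UPDATE TOP (50000) dbo.ChildTable   -- made-up table name
SET    ParentId = 42                -- made-up new key value
WHERE  ParentId = 17                -- made-up old key value
-- the SET that follows the placeholder then adds @@ROWCOUNT to @AffectedRows;
-- the UPDATE touches 0 rows once every matching row has been changed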

Comments (3)

梦明 2024-09-13 09:42:03


In the end, the example I had already written worked best: a "transaction log full" error gets caught in the CATCH block, and 15 minutes is long enough for the log to be recycled.

DECLARE
    @AffectedRows int,
    @RowsToUpdate int   -- total number of rows that need to change; set this before the loop

SET @AffectedRows = 0

WHILE @AffectedRows < @RowsToUpdate
BEGIN
    BEGIN TRY
        BEGIN TRAN
        -- Do some updates (one batch per iteration; @@ROWCOUNT picks up the UPDATE's row count)
        SET @AffectedRows = @AffectedRows + @@ROWCOUNT
        COMMIT TRAN
    END TRY
    BEGIN CATCH
        -- e.g. error 9002 (transaction log full): roll back, wait for the log
        -- to be recycled, then let the loop retry this batch
        IF @@TRANCOUNT > 0 ROLLBACK TRAN
        PRINT ERROR_MESSAGE()
        WAITFOR DELAY '00:15:00'
    END CATCH
END

PRINT @AffectedRows

帅气尐潴 2024-09-13 09:42:03


A few points / ideas:

  1. You can expand your transaction log to whatever size you want so it does not fill up.
  2. If your transaction log grows too much, you can always back up your DB and truncate the log.
  3. You can work through the data in batches (do a million at a time).
  4. You can copy the data to a working table and then sp_rename it in when the processing is done (see the sketch below).
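
A rough sketch of idea 4, with made-up object names (dbo.Orders as the real table, dbo.Orders_Work as the working copy), might be:

-- Rough sketch of idea 4; the tables, columns and key values are made up.
-- SELECT ... INTO is minimally logged under the SIMPLE or BULK_LOGGED
-- recovery model, so building the corrected copy barely touches the log.
SELECT  o.OrderId,
        CASE WHEN o.ParentId = 17 THEN 42 ELSE o.ParentId END AS ParentId,
        o.SomeOtherColumn
INTO    dbo.Orders_Work
FROM    dbo.Orders AS o

-- recreate indexes, constraints and permissions on dbo.Orders_Work, then swap:
BEGIN TRAN
    EXEC sp_rename 'dbo.Orders',      'Orders_Old'
    EXEC sp_rename 'dbo.Orders_Work', 'Orders'
COMMIT TRAN
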
别靠近我心 2024-09-13 09:42:03


You're reinventing nibbling deletes/updates :)

Take a look at this approach; you can do bigger blocks than a single row:

http://www.sqlservervideos.com/video/nibbling-deletes/

http://sqladvice.com/blogs/repeatableread/archive/2005/09/20/12795.aspx
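
A minimal nibbling-update sketch (table, column and key values are made up; pick whatever batch size your log can comfortably absorb) would be something like:

-- Minimal nibbling-update sketch; table, column and key values are made up.
-- Each iteration updates at most one batch, so every transaction (and its
-- log usage) stays small, and the loop ends when nothing is left to change.
DECLARE @BatchSize int
SET @BatchSize = 10000

WHILE 1 = 1
BEGIN
    UPDATE TOP (@BatchSize) dbo.ChildTable
    SET    ParentId = 42
    WHERE  ParentId = 17

    IF @@ROWCOUNT = 0 BREAK   -- no rows left to update
END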
