Copying data from one table to another using a cursor

Published on 2024-12-17 21:31:20

I have a really large table (about 13 million rows) called Book. I want to set the primary key on a column of the Book table, but because it is such a large table the server crashes during the update: it runs out of memory. So I created a BookTemp table, set up all the primary keys on this empty table, and now I want to insert the data from Book into BookTemp. But if I do it all at once, the memory runs out again. So I thought of using a cursor to insert 10,000 rows at a time and then free the RAM, but I am really new to cursors, so at this point I would like your help.

I use SQL Server 2008 R2.
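
For reference, the BookTemp setup described above might look roughly like the sketch below; the column names and types (BookID, Title) are placeholders assumed for illustration, since the real Book schema is not shown in the question.

CREATE TABLE BookTemp
(
    BookID INT NOT NULL,                                    -- hypothetical key column
    Title  NVARCHAR(255) NULL,                              -- hypothetical data column
    CONSTRAINT PK_BookTemp PRIMARY KEY CLUSTERED (BookID)   -- primary key defined while the table is still empty
);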

Comments (2)

萌逼全场 2024-12-24 21:31:20

I would suggest using a while loop to iterate over your temporary table. The example here should get you started.

Or you could just modify this:

DECLARE @counter AS INT = 0;
DECLARE @batch_size AS INT = 10000;
WHILE (@counter <= (SELECT MAX(id) FROM temp_table))  -- <= so the last row is not skipped when MAX(id) falls on a batch boundary
BEGIN
  INSERT INTO the_table
  SELECT * FROM temp_table
  WHERE id BETWEEN @counter AND (@counter + @batch_size - 1);

  SET @counter = @counter + @batch_size;
END
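
Applied to the tables from the question, a keyset-style variant of the same batching idea avoids assuming the id values start at 0 or are dense. This is only a sketch: it assumes Book has a positive integer key column named id and placeholder data columns column1 and column2, none of which are given in the question.

DECLARE @last_id INT = 0;   -- assumes ids are positive; start below the smallest one
DECLARE @rows INT = 1;

WHILE (@rows > 0)
BEGIN
    INSERT INTO BookTemp (id, column1, column2)   -- list the real columns here
    SELECT TOP (10000) id, column1, column2
    FROM Book
    WHERE id > @last_id
    ORDER BY id;

    SET @rows = @@ROWCOUNT;                       -- rows copied in this batch; 0 means the copy is done

    IF @rows > 0
        SELECT @last_id = MAX(id) FROM BookTemp;  -- remember how far the copy has reached
END

Because each INSERT ... SELECT is a separate statement, every batch commits on its own, which keeps the memory and log usage of any single batch bounded.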
我很坚强 2024-12-24 21:31:20

When executed, the following three commands free up memory for SQL Server by cleaning out its caches.

DBCC FREESYSTEMCACHE ('ALL')
DBCC FREESESSIONCACHE
DBCC FREEPROCCACHE 

However, they can also be used during an ongoing operation, so please see the query below: after every 10,000 inserts the memory is cleared by executing the DBCC commands above.

DECLARE @counter INT = 1
DECLARE @column1 NVARCHAR(255), @column2 NVARCHAR(255)  -- placeholder types; match them to the actual Book columns

DECLARE cur_Data_Transfer CURSOR FOR  -- Cursor declared
SELECT column1, column2  -- select desired columns
FROM Book

OPEN cur_Data_Transfer  -- Opening cursor
FETCH NEXT FROM cur_Data_Transfer INTO @column1, @column2  -- Put values into the variables

WHILE @@FETCH_STATUS = 0  -- Fetch succeeded
BEGIN
       INSERT INTO BookTemp (column1, column2) -- Inserting into BookTemp
            VALUES(@column1, @column2)
       IF @counter = 10000
       BEGIN
        DBCC FREESYSTEMCACHE ('ALL') -- Clear system cache (the 'ALL' argument is required)
        DBCC FREEPROCCACHE           -- Clear plan cache
        SET @counter = 0             -- Restart the counter
       END
       FETCH NEXT FROM cur_Data_Transfer INTO @column1, @column2
       SET @counter = @counter + 1
END
CLOSE cur_Data_Transfer  -- Closing cursor
DEALLOCATE cur_Data_Transfer -- De-allocating
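
If the row-by-row cursor is kept, one possible refinement is to group every 10,000 single-row inserts into one explicit transaction, so there are far fewer commits (and log flushes) than one per row. This is a sketch under the same placeholder column assumptions as the answer above.

DECLARE @counter INT = 1;
DECLARE @column1 NVARCHAR(255), @column2 NVARCHAR(255);  -- placeholder types; match the real Book columns

DECLARE cur_Data_Transfer CURSOR FAST_FORWARD FOR
    SELECT column1, column2 FROM Book;

OPEN cur_Data_Transfer;
FETCH NEXT FROM cur_Data_Transfer INTO @column1, @column2;

BEGIN TRANSACTION;
WHILE @@FETCH_STATUS = 0
BEGIN
    INSERT INTO BookTemp (column1, column2) VALUES (@column1, @column2);

    IF @counter = 10000
    BEGIN
        COMMIT TRANSACTION;   -- close the finished batch of 10,000 rows
        BEGIN TRANSACTION;    -- open the next batch
        SET @counter = 0;
    END

    FETCH NEXT FROM cur_Data_Transfer INTO @column1, @column2;
    SET @counter = @counter + 1;
END
COMMIT TRANSACTION;           -- commit the final partial batch

CLOSE cur_Data_Transfer;
DEALLOCATE cur_Data_Transfer;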