PL/SQL performance check for updating a table with millions of records

Posted on 2025-01-12 20:03:04

The block below updates the large table t1 with data found in table t2. It runs fine when I update a comment code that has about 500 records, but it takes 30 minutes once more than 1,000 records are involved. I have tried a BULK COLLECT update and an index on the comment code, and neither made much difference in run time.

DECLARE
   lv_total_count   NUMBER(9)            := 0;
   lv_commit_cnt    SIMPLE_INTEGER       := 0;
   lv_limit         CONSTANT PLS_INTEGER := 1000;  -- commit interval; not declared in the original post, value assumed
BEGIN
   FOR rec IN
      (SELECT a.t1_id,
              a.t1_orig_code,
              t2_orig_code,
              a.t1_comment_code,
              t2_code,
              a.t1_restrict_update_ind,
              t2_restrict_update_ind,
              a.t1_data_origin,
              t2_data_origin,
              a.t1_purge_ind,
              t2_purge_ind,
              a.t1_created_date,
              a.rowid AS row_id
         FROM t1 a
         JOIN t2 ON t2_code = a.t1_comment_code
        WHERE a.t1_comment_code IN ('A', 'B', 'C', 'C1', 'D3')
          AND (   a.t1_orig_code           != t2_orig_code
               OR a.t1_restrict_update_ind != t2_restrict_update_ind
               OR a.t1_data_origin         != t2_data_origin
               OR a.t1_purge_ind           != t2_purge_ind))
   LOOP
      lv_total_count := lv_total_count + 1;

      UPDATE t1
         SET t1_orig_code           = rec.t2_orig_code,
             t1_restrict_update_ind = 'Y',
             t1_data_origin         = rec.t2_data_origin,
             t1_purge_ind           = rec.t2_purge_ind
       WHERE t1.rowid = rec.row_id;

      lv_commit_cnt := lv_commit_cnt + 1;
      IF MOD(lv_commit_cnt, lv_limit) = 0 THEN
         lv_commit_cnt := 0;
         COMMIT;
      END IF;

      dbms_output.put_line('a.t1_id - ' || rec.t1_id ||
                           ' a.t1_orig_code - ' || rec.t1_orig_code || ' Updated');
   END LOOP;

   COMMIT;
   dbms_output.put_line('Total_count - ' || lv_total_count);
   -- dbms_output.put_line('No record');
END;
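
For reference, the post also mentions having tried a bulk update. A minimal sketch of that pattern (BULK COLLECT into per-column collections, then one FORALL update per batch) is shown below; the collection names and the 1,000-row batch size are assumptions, not taken from the original code. This cuts the PL/SQL-to-SQL context switches, but it is still not set-based, which is why the answer below suggests a single MERGE instead.

DECLARE
   -- One collection per column being copied; names here are illustrative only
   TYPE t_rowid_tab  IS TABLE OF ROWID;
   TYPE t_orig_tab   IS TABLE OF t2.t2_orig_code%TYPE;
   TYPE t_origin_tab IS TABLE OF t2.t2_data_origin%TYPE;
   TYPE t_purge_tab  IS TABLE OF t2.t2_purge_ind%TYPE;

   lt_rowids   t_rowid_tab;
   lt_origs    t_orig_tab;
   lt_origins  t_origin_tab;
   lt_purges   t_purge_tab;

   CURSOR c_diff IS
      SELECT a.rowid, t2_orig_code, t2_data_origin, t2_purge_ind
        FROM t1 a
        JOIN t2 ON t2_code = a.t1_comment_code
       WHERE a.t1_comment_code IN ('A', 'B', 'C', 'C1', 'D3')
         AND (   a.t1_orig_code           != t2_orig_code
              OR a.t1_restrict_update_ind != t2_restrict_update_ind
              OR a.t1_data_origin         != t2_data_origin
              OR a.t1_purge_ind           != t2_purge_ind);
BEGIN
   OPEN c_diff;
   LOOP
      -- Pull the next batch of changed rows; the 1000-row limit is an arbitrary choice
      FETCH c_diff BULK COLLECT INTO lt_rowids, lt_origs, lt_origins, lt_purges LIMIT 1000;
      EXIT WHEN lt_rowids.COUNT = 0;

      -- One switch into the SQL engine per batch instead of one per row
      FORALL i IN 1 .. lt_rowids.COUNT
         UPDATE t1
            SET t1_orig_code           = lt_origs(i),
                t1_restrict_update_ind = 'Y',
                t1_data_origin         = lt_origins(i),
                t1_purge_ind           = lt_purges(i)
          WHERE t1.rowid = lt_rowids(i);

      COMMIT;
   END LOOP;
   CLOSE c_diff;
END;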

Appreciate any input on this.



Comments (1)

如梦亦如幻 2025-01-19 20:03:04

No surprise it takes ages; row-by-row processing is slow-by-slow.

How about MERGE instead?

merge into t1
  using t2
  on (t1.t1_comment_code = t2.t2_code)
when matched then update set
  t1.t1_orig_code = t2.t2_orig_code,
  t1.t1_restrict_update_ind = 'Y',
  t1.t1_data_origin = t2.t2_data_origin,
  t1.t1_purge_ind = t2.t2_purge_ind
where t1.t1_comment_code in ('A', 'B', 'C', 'C1', 'D3');

Just like that; no PL/SQL, no loop, no commit in the middle of the loop ... nothing. Just MERGE.
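
If you also want the statement to skip rows whose values already match (which is what the original cursor's WHERE clause filtered on), the same conditions can go into the update's WHERE clause. A sketch, assuming the column names from the question:

merge into t1
  using t2
  on (t1.t1_comment_code = t2.t2_code)
when matched then update set
  t1.t1_orig_code = t2.t2_orig_code,
  t1.t1_restrict_update_ind = 'Y',
  t1.t1_data_origin = t2.t2_data_origin,
  t1.t1_purge_ind = t2.t2_purge_ind
where t1.t1_comment_code in ('A', 'B', 'C', 'C1', 'D3')
  -- only touch rows that actually differ, mirroring the original cursor's filter
  and (   t1.t1_orig_code           != t2.t2_orig_code
       or t1.t1_restrict_update_ind != t2.t2_restrict_update_ind
       or t1.t1_data_origin         != t2.t2_data_origin
       or t1.t1_purge_ind           != t2.t2_purge_ind);

As with any MERGE, t2_code must match at most one t2 row per t1 row; otherwise Oracle raises ORA-30926 (unable to get a stable set of rows in the source tables).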

