After insert trigger - SQL Server 2008

Posted 2024-08-18 07:39:26

I have data coming in from DataStage that is being put into our SQL Server 2008 database in a table: stg_table_outside_data. The outside source puts the data into that table every morning. I want to move the data from stg_table_outside_data to table_outside_data, where I keep multiple days' worth of data.

I created a stored procedure that inserts the data from stg_table_outside_Data into table_outside_data and then truncates stg_table_outside_Data. The outside DataStage process is outside of my control, so I have to do all of this within SQL Server 2008. I had originally planned on using a simple AFTER INSERT trigger, but DataStage commits after every 100,000 rows. The trigger would run after the first commit and cause a deadlock error for the DataStage process.
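
Roughly, a sketch of what such a procedure can look like (the ProductID key column, the column list, and the procedure name are placeholders, not the real schema; the NOT EXISTS check is one way to avoid duplicates):

CREATE PROCEDURE dbo.usp_load_table_outside_data     -- placeholder name
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRAN;

    -- Copy only rows that are not already in the working table, to avoid duplicates.
    INSERT INTO dbo.table_outside_data (ProductID /*, other columns */)
    SELECT s.ProductID /*, other columns */
    FROM dbo.stg_table_outside_data AS s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.table_outside_data AS t
                      WHERE t.ProductID = s.ProductID);

    -- Clear the staging table for the next morning's load.
    TRUNCATE TABLE dbo.stg_table_outside_data;

    COMMIT TRAN;
END;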

Is there a way to set up an AFTER INSERT trigger that waits 30 minutes and then makes sure there wasn't a new commit within that time frame? Is there a better solution to my problem? The goal is to get the data out of the staging table and into the working table without duplicates, and then truncate the staging table for the next morning's load.

I appreciate your time and help.

Comments (4)

请恋爱 2024-08-25 07:39:26

One way you could do this is to take advantage of the new MERGE statement in SQL Server 2008 (see the MSDN docs and this blog post) and just schedule that as a SQL job every 30 minutes or so.

The MERGE statement lets you define operations (INSERT, UPDATE, DELETE, or nothing at all) depending on whether the source data (your staging table) and the target data (your "real" table) match on some criteria or not.

So in your case, it would be something like:

MERGE table_outside_data AS target
USING stg_table_outside_data AS source
ON (target.ProductID = source.ProductID)   -- whatever join makes sense for you

WHEN NOT MATCHED THEN
    INSERT VALUES(.......)                 -- your column values here
;                                          -- MERGE must be terminated with a semicolon

-- To "do nothing" for matched rows, simply omit the WHEN MATCHED clause;
-- WHEN MATCHED THEN with no UPDATE or DELETE is not valid syntax.
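
One practical note: the MERGE on its own leaves the staging table populated, so the scheduled job would still need a TRUNCATE TABLE stg_table_outside_data (or a DELETE) after the merge so the table is empty for the next morning's load.
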
半窗疏影 2024-08-25 07:39:26

You shouldn't be using a trigger to do this; you should use a scheduled job.
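
For instance, a SQL Server Agent job can run the load once the morning feed has finished. A rough sketch using the msdb job procedures is below (the job, database, and procedure names and the start time are all placeholders; the same thing can also be set up through the SSMS Agent UI):

USE msdb;
GO

EXEC dbo.sp_add_job
    @job_name = N'Load table_outside_data';

EXEC dbo.sp_add_jobstep
    @job_name      = N'Load table_outside_data',
    @step_name     = N'Merge staging into working table',
    @subsystem     = N'TSQL',
    @database_name = N'YourDatabase',
    @command       = N'EXEC dbo.usp_load_table_outside_data;';   -- placeholder procedure

-- Daily schedule; 070000 = 07:00:00, i.e. after the morning feed is expected to finish.
EXEC dbo.sp_add_schedule
    @schedule_name     = N'Daily after morning load',
    @freq_type         = 4,        -- daily
    @freq_interval     = 1,        -- every 1 day
    @active_start_time = 070000;

EXEC dbo.sp_attach_schedule
    @job_name      = N'Load table_outside_data',
    @schedule_name = N'Daily after morning load';

EXEC dbo.sp_add_jobserver
    @job_name = N'Load table_outside_data';   -- defaults to the local server
GO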

可是我不能没有你 2024-08-25 07:39:26

Maybe build a procedure that moves all the data from stg_table_outside_Data to table_outside_data once a day, or use the job scheduler.

不知所踪 2024-08-25 07:39:26

Do a row count in the trigger; if the count is less than 100,000, do nothing. Otherwise, run your process.
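
A rough sketch of that guard, assuming the full morning load is at least 100,000 rows (the threshold and the procedure name are placeholders):

CREATE TRIGGER dbo.trg_stg_table_outside_data_ai
ON dbo.stg_table_outside_data
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Do nothing until the staging table holds the full morning load.
    IF (SELECT COUNT(*) FROM dbo.stg_table_outside_data) < 100000
        RETURN;

    -- The full load appears to be in: move it into table_outside_data.
    -- Clearing the staging table is better done in a separate step, so the
    -- trigger is not truncating the very table it is firing on.
    EXEC dbo.usp_copy_outside_data;   -- placeholder procedure
END;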
