SQL Server - best replication model for archiving

Published 2024-11-19 15:43:08


I am at the point where I need to start archiving the database of a website I'm maintaining. Basically, lots of transactional data is generated daily, and after approximately 2 weeks this data is pretty much useless - it is still needed for reporting purposes, but it can be safely moved off the production server.

So, ideally, I would love to do this:

  1. Set up pull replication - the backup server fetches changes from the production server hourly.
  2. Daily - the production server database is cleaned - entries older than 2 weeks are deleted.

On #1 - I am curious whether pull replication is the most efficient way to go from a performance standpoint? I don't want to put strain on the production server (or at least not a big strain)... I don't care much about keeping the databases in sync.

On #2 - How do I make sure that these deletes are not replicated - that the data is kept on the archive server?

The production server runs SQL Server 2008 Enterprise; the backup server can run whatever is needed (currently it runs SQL Server 2008 Express).
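For context, the daily cleanup in step 2 might look like the following sketch. This is not from the original post - the table name `dbo.Transactions` and the date column `CreatedAt` are placeholders, and the batch size is arbitrary:

```sql
-- Hypothetical daily purge: delete rows older than 2 weeks in small
-- batches to limit transaction-log growth and lock duration.
DECLARE @cutoff datetime = DATEADD(DAY, -14, GETDATE());

WHILE 1 = 1
BEGIN
    DELETE TOP (5000) FROM dbo.Transactions
    WHERE CreatedAt < @cutoff;

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to purge
END
```

Batching the delete keeps each transaction short, which matters on a busy production server.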


Comments (2)

醉城メ夜风 2024-11-26 15:43:08

For #1, I'd say test it out and see. MS typically has a couple of whitepapers saying that pull subscriptions are more performant, but I don't recall them saying what it costs in terms of resources. If you're worried about it, set up a remote distributor (remote = not the same server as the publisher).

For #2, there are a couple of ways to do it. When you set up the article with sp_addarticle, there's a parameter @del_cmd which can take a value of "none". This tells the distribution agent to no-op deletes at the subscriber. You could also do your purging with a stored procedure, set up the execution of that stored procedure as an article in replication, and then have a stored procedure at the subscriber that does nothing. Each approach has its pros and cons.
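A minimal sketch of the @del_cmd approach described above. sp_addarticle and @del_cmd are the documented replication procedure and parameter, but the publication, schema, and table names here are placeholders, and a real call needs the rest of your publication already configured:

```sql
-- Hypothetical article setup: replicate inserts and updates normally,
-- but turn DELETEs into no-ops at the subscriber, so rows purged from
-- production survive on the archive server.
EXEC sp_addarticle
    @publication   = N'ArchivePub',     -- placeholder publication name
    @article       = N'Transactions',   -- placeholder article name
    @source_owner  = N'dbo',
    @source_object = N'Transactions',   -- placeholder source table
    @del_cmd       = N'NONE';           -- do not propagate DELETEs
```

With this in place, the nightly purge on the publisher leaves the subscriber's copy of the old rows untouched.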

篱下浅笙歌 2024-11-26 15:43:08

I think we should use table partitioning for archiving instead. If you set up replication and archive data older than two weeks, the problem is that whenever replication has to be reinitialized, all data on the subscriber is deleted and replaced with a fresh snapshot from the publisher - so the archived rows would be lost.
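Partition switching, which this answer alludes to, moves an entire partition's rows out of the production table as a metadata-only operation (Enterprise Edition only in SQL Server 2008). A rough sketch under assumed names - the table, staging table, and partition number are all placeholders, and both tables must have identical structure on the same filegroup:

```sql
-- Hypothetical archive step: dbo.Transactions is partitioned by date,
-- and dbo.Transactions_Staging matches its structure exactly.
-- Switching the oldest partition out is near-instant, regardless of
-- how many rows it holds.
ALTER TABLE dbo.Transactions
SWITCH PARTITION 1 TO dbo.Transactions_Staging;

-- The staging table can then be copied to the archive server
-- (e.g. with bcp or SSIS) and truncated, independent of replication.
```

This sidesteps the reinitialization problem entirely, since the archive server never depends on replication state.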
