Making the production database available locally

Published 2024-08-10 17:55:06


I'm nearing the stage of saying "let's go live" on a client's system I've been working on for the past few months: basically a few autonomous services, including a public-facing website, an intranet website, an hourly exporter from a legacy OLTP DB (based on modified flags/triggers, etc.), and a few other services that compose the system, each with its own specialized database, communicating with each other via messages (NServiceBus).

When I started I tried to keep a local replication of everything, but it's proving more and more difficult, and on reflection probably a major friction point of the past few weeks. I like to keep regularly up to date, as the legacy database is growing and generating hundreds of events daily. Having high latency & mediocre bandwidth (between myself and the client's site; I'm in SE Asia, where bandwidth is generally crap anyway) is also an issue for RDP, SQL tools, remote connection strings, etc. Tracking down integration bugs and understanding the scenarios they present during feedback/integration/QA is also difficult, as my data doesn't reflect the current state of the client's DB (the client's staff have been working and evolving the data at their end), and means another break, coffee, and lengthy sync again. It would be ideal to do it all locally and then deploy at the end, but I have to deliver parts incrementally (to get the check), and some parts are even in use (although not critical), so bug fixes on in-use parts need a quick turnaround. And with it being a small company, incremental feedback helps flush out some of the vaguer requirements along the way (curse me).

I was thinking it would be good to have a twice-daily sync between the environments (their DBs to mine). I have some design control over everything apart from the legacy SQL Server database.

What are the best options, SO users?

I was thinking of setting up a lightweight Windows 2003 VM on my dev box, and installing on it the same setup as the client's site (though obviously not spread across multiple servers). And then for syncing the databases I was thinking about SQL Server replication? Or batch scripts? Or are there any better tools, ones that are fast and have good compression? I don't want my changes to go back to production (I have a separate CI & deployment procedure); I just want (I think I want... tell me if there's a better idea) my databases to be refreshed every night or twice a day (maybe while I'm at lunch, bandwidth permitting).

How does everyone approach this?


1 answer

浅黛梨妆こ 2024-08-17 17:55:06


I would recommend two ways to do this:

  • Snapshot Replication
  • Backing up the transaction log and manually (or batch) applying it

Snapshot replication can be difficult to get working, but it is possible even in offline situations, such as physically carrying snapshots to another location.

The transaction log method can be used as part of your standard backup procedures, i.e. a full backup twice a week, with transaction log backups taken more regularly.
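As a rough sketch of what that could look like in T-SQL (database name and file paths here are placeholders, not the asker's actual setup; note that `WITH COMPRESSION` would also be available on SQL Server 2008+, but not on the older/legacy versions mentioned in the question):

```sql
-- On the client's server: full backup twice a week, log backups in between.
BACKUP DATABASE LegacyOLTP
    TO DISK = N'D:\Backups\LegacyOLTP_full.bak'
    WITH INIT;

BACKUP LOG LegacyOLTP
    TO DISK = N'D:\Backups\LegacyOLTP_log.trn'
    WITH INIT;

-- Locally: restore the full backup without recovering, then apply each log
-- backup in order; WITH RECOVERY on the last one brings the database online.
RESTORE DATABASE LegacyOLTP
    FROM DISK = N'C:\Sync\LegacyOLTP_full.bak'
    WITH NORECOVERY, REPLACE;

RESTORE LOG LegacyOLTP
    FROM DISK = N'C:\Sync\LegacyOLTP_log.trn'
    WITH RECOVERY;
```

One caveat: once the local database has been restored `WITH RECOVERY`, no further log backups can be applied to it, so each refresh replays the chain from the last full backup. Restoring `WITH STANDBY` instead would let you keep applying logs between refreshes, at the cost of the local copy being read-only.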

Remember that best practice is to cleanse the data before using it in a test environment. At the very least this means changing all personal data, especially email addresses, passwords, and anything else that could result in some automated process making contact with the users in your database.
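A minimal sketch of that kind of cleansing pass, run against the local copy after each refresh (the table and column names here are hypothetical, not from the asker's schema):

```sql
-- Hypothetical schema: overwrite anything that could reach a real person.
-- example.invalid is a reserved domain, so scrambled addresses can never
-- be delivered even if a notification job runs by accident.
UPDATE dbo.Users
SET Email        = 'user' + CAST(UserId AS varchar(10)) + '@example.invalid',
    PasswordHash = 'disabled',
    Phone        = NULL;
```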
