Regression Testing and Deployment Strategy
I'd like some advice on a deployment strategy. If a development team creates an extensive framework, and many (20-30) applications consume it, and the business would like application updates at least every 30 days, what is the best deployment strategy?
The reason I ask is that there seems to be a lot of waste (and risk) in using an agile approach of deploying changes monthly if 90% of the applications don't change. What I mean is that the framework can change during the month, and so can a few applications. Because the framework changed, all applications should be regression-tested. If, say, 10 of the applications don't change at all during the year, those 10 applications still get regression-tested EVERY MONTH, even though they have no feature changes or hot fixes. They have to be tested simply because the business is rolling updates every month.
And the risk that is involved... if a mission-critical application takes a few weeks, and multiple departments, to test, is it realistic to expect to regression-test it constantly?
One option is to make any framework updates backward-compatible. While this would mean that applications don't need to change their code, they would still need to be tested because the underlying framework changed. And the risk involved is great; a constantly changing framework (and deploying this framework) means the mission-critical app can never just enjoy the same code base for a long time.
These applications share the same database, hence the need for the constant testing. I'm aware of TDD and automated tests, but neither exists here at the moment.
Any advice?
Comments (4)
The idea behind a framework is that it's supposed to be the "slow moving code". You shouldn't be changing the framework as frequently as the applications it supports. Try getting the framework on a slower development cycle: perhaps a release no more often than every three or six months.
My guess is that you're still working out some of the architectural decisions in this framework. If you think the framework changes really need to be that dynamic, find out what parts of the framework are being changed so often, and try to refactor those out to the applications that need them.
Agile doesn't have to mean unlimited changes to everything. Your architect could place boundaries on what constitutes the framework, and keep people from tweaking it so readily for what are likely application shortcuts. It may take a few iterations to get it settled down, but after that it should be more stable.
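One way to enforce that slower cycle is to version the framework explicitly and let each application pin the version it was tested against. A minimal sketch, assuming a semantic-versioning convention (the scheme and function names are illustrative, not from the answer):

```python
# Sketch of a version-pinning rule (assumed semantic-versioning convention:
# major bump = breaking change, minor/patch bump = backward-compatible).
# Apps pinned to an old framework version ship unchanged and skip retesting;
# only apps that pick up a new framework version enter the regression cycle.

def parse_version(v):
    """Split 'MAJOR.MINOR.PATCH' into a tuple of ints for comparison."""
    return tuple(int(part) for part in v.split("."))

def release_impact(pinned, released):
    """Classify a framework release relative to the version an app uses."""
    p, r = parse_version(pinned), parse_version(released)
    if r == p:
        return "none"        # app is already on this version
    if r[0] != p[0]:
        return "breaking"    # code changes plus a full regression pass
    return "compatible"      # app may stay pinned; retest only on upgrade

print(release_impact("2.3.1", "3.0.0"))  # breaking
print(release_impact("2.3.1", "2.4.0"))  # compatible
```

Under a rule like this, an unchanged application keeps shipping against its pinned version, and the monthly business release doesn't force a framework upgrade (or a regression pass) on it.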
I wouldn't call it an Agile approach unless you have (unit) test coverage. One of the key tenets of Agile is that you have robust unit tests that provide a safety net for frequent refactoring and new feature development. There is a lot of risk in your scenario. Deploying twenty to thirty applications a month when 1) most of them don't add any new business value to their users; and 2) there are no tests in place would not qualify as a good idea in my book. And I'm a strong believer in Agile. But you can't pick and choose only the parts of it that are convenient.
If the business application has not changed, I wouldn't release it just to compile it against a new framework. Imagine every .NET application needing to be re-released every time the framework changed. Reading into your question, I wonder if the common database is driving the need for this. If your framework is isolating the schema and you're finding you need to rebuild apps whenever the schema changes, then you need to tackle that problem first. Check out Refactoring Databases by Scott Ambler for some tips.
As another aside, there's a big difference between integration tests and unit tests. Your regression tests are integration tests, and it's very difficult to automate at that level. I think the breakthroughs happening in testing are all about writing highly testable code, which makes unit testing more and more of the code base possible.
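To make the "highly testable code" point concrete: if application logic reaches the shared database only through a narrow repository interface, most of it can be unit-tested against an in-memory fake, with no database and no integration environment at all. A hypothetical sketch (all names invented):

```python
# Hypothetical sketch: business logic depends on an abstract repository, so
# it can be unit-tested with an in-memory fake instead of the shared database.

class AccountRepository:
    """Interface the framework's DAL would implement against the real schema."""
    def balance(self, account_id):
        raise NotImplementedError

class OverdraftChecker:
    def __init__(self, repo):
        self.repo = repo  # injected dependency: real DAL or a test fake

    def can_withdraw(self, account_id, amount):
        return self.repo.balance(account_id) >= amount

# A fake for unit tests -- fast, isolated, runs on every framework change.
class FakeRepo(AccountRepository):
    def __init__(self, balances):
        self.balances = balances
    def balance(self, account_id):
        return self.balances[account_id]

checker = OverdraftChecker(FakeRepo({"acct-1": 100}))
assert checker.can_withdraw("acct-1", 50)
assert not checker.can_withdraw("acct-1", 500)
```

Only the thin DAL implementation then needs the slow, database-backed integration tests; everything above it can be regression-tested in seconds.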
Here are some tips I can think of:
1. Break the framework into independent parts, so that changing one part requires rerunning only a small portion of the test cases.
2. Employ a test-case prioritization technique; that is, rerun only a small portion of each application's test pool, selected by some strategy. Additional-branch-coverage prioritization and adaptive random testing (ART) usually perform better than the alternatives, but they require branch-coverage information for each test case.
3. Update the framework less frequently. If an application doesn't need to change, it's OK not to change it, and it's OK for it to keep using an old version of the framework. You could update the framework for these applications, say, every 3 months.
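The "additional branch" strategy in tip 2 is a greedy loop: repeatedly pick the test that covers the most branches not yet exercised by the tests already selected. A small sketch, with invented coverage data:

```python
# Greedy "additional coverage" prioritization: at each step, pick the test
# that covers the most branches not yet covered by already-selected tests.

def prioritize(coverage):
    """coverage: dict mapping test name -> set of branch ids it covers."""
    remaining = dict(coverage)
    covered = set()
    order = []
    while remaining:
        # Test contributing the largest number of still-uncovered branches.
        best = max(remaining, key=lambda t: len(remaining[t] - covered))
        if not remaining[best] - covered:
            # No test adds new coverage; append the rest in a stable order.
            order.extend(sorted(remaining))
            break
        order.append(best)
        covered |= remaining.pop(best)
    return order

cov = {
    "t1": {"b1", "b2"},
    "t2": {"b2", "b3", "b4"},
    "t3": {"b1"},
}
print(prioritize(cov))  # ['t2', 't1', 't3']
```

Running the front of this ordering first means most of the framework change's branches get exercised early, so you can cut the run off when time runs out and still have covered the riskiest ground.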
Regression testing is a way of life. You will need to regression test every application before it is released. However, since time and money are not usually infinite, you should focus your testing on the areas with the most changes. A quick and dirty way to identify these areas is to count the lines of code changed in a given business area; say "accounting" or "user management". Those should get the most testing first, along with any areas that you have identified as "mission critical".
Now I know that lines of code changed is not necessarily the best way to measure change. If you have well defined change requests, it is actually better to evaluate these hot spots by looking at the number and complexity of the change requests. But not everyone has that luxury.
When you are talking about making a change to the framework, you probably don't need to test all the code that uses it. If you're talking about a change to something like the DAL, that would basically amount to everything anyway. You just need to test a large enough sample of the code to be reasonably comfortable that the change is solid. Again, start with the "mission critical" areas and the area most heavily affected.
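The quick-and-dirty line count can be scripted straight from version control. A sketch, assuming Git, that tallies changed lines per top-level directory from `git diff --numstat` output (sample output is hard-coded here; in practice you would capture it with `subprocess`):

```python
# Tally changed lines per top-level area from `git diff --numstat` output
# (tab-separated fields: lines added, lines deleted, path). Sample output
# is hard-coded; in practice, capture it from your repo with subprocess.

from collections import Counter

def hotspots(numstat_output):
    totals = Counter()
    for line in numstat_output.strip().splitlines():
        added, deleted, path = line.split("\t")
        if added == "-":             # binary files report "-"; skip them
            continue
        area = path.split("/")[0]    # crude: first directory = business area
        totals[area] += int(added) + int(deleted)
    return totals.most_common()      # biggest hotspots first

sample = (
    "120\t40\taccounting/ledger.py\n"
    "5\t2\tusermgmt/login.py\n"
    "-\t-\taccounting/report.xls\n"
    "30\t10\taccounting/tax.py\n"
)
print(hotspots(sample))  # [('accounting', 200), ('usermgmt', 7)]
```

The output is the testing priority list: areas with the most churn (plus anything mission-critical) get tested first.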
I find it helpful to divide the project into 3 distinct code streams: Development, QA, and Production. Development is open to all changes, QA is feature-locked, and Production is code-locked (well, as locked as it gets anyway). If you are releasing to production on a monthly cycle, you probably want to branch a QA build from the Development code at least 1 month before the release. Then you spend that month acceptance-testing the new changes and regression-testing everything else that you can. You'll probably have to complete testing the changes about a week before the release so that the app can be staged and you can dry-run the installation a few times. You won't get to regression test everything, so have a strategy ready for releasing patches to Production. Don't forget to merge those patches back into the QA and Development code streams too.
Automating the regression tests would be a really great thing, in theory. In practice, you end up spending more time updating the test code than you would spend running the test scripts manually. Besides, you can hire two or three testing monkeys for the price of one really good test-script developer. Sad but true.