Getting started with automated integration/unit testing in an existing codebase

Background: We have been handed a very large codebase (1.4 million lines) that is primarily in C#. The application consists mainly of ASP.NET 2.0-style ASMX web services accessing data in a SQL Server 2008 database and also in various XML files. There are no existing automated tests in place. We have an automated nightly build in place (CC.NET).

We want to introduce some level of automated testing, but refactoring this amount of code to support granular unit tests seems unlikely. Our first thought is to construct automated tests that simply call each web service with a given set of parameters, to give us some level of code coverage. That seemed like the quickest way to get the highest amount of code coverage from some automated tests. Is this even called unit testing, or would it be considered something else?

How would I go about isolating the data stores to get consistent test results? Would any test tools work better for this approach than others? xUnit? MSTest? NUnit?

Any suggestions to get us started in the right direction would be greatly appreciated. Thanks
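
For concreteness, here is a rough sketch of the kind of per-service smoke test we are imagining, written against NUnit; the OrdersService proxy class, URL, parameters and expected values are made-up placeholders rather than our real API:

```csharp
using NUnit.Framework;

// "OrdersService" is a hypothetical wsdl.exe-generated ASMX proxy; the URL,
// parameters and expected values are placeholders for a known data set.
[TestFixture]
public class OrdersServiceSmokeTests
{
    [TestCase(1001, 2)]   // order id, expected number of line items
    [TestCase(1002, 0)]
    public void GetOrder_ReturnsExpectedLineCount(int orderId, int expectedLines)
    {
        var client = new OrdersService();
        client.Url = "http://test-server/OrdersService.asmx";

        var order = client.GetOrder(orderId);

        Assert.IsNotNull(order);
        Assert.AreEqual(expectedLines, order.Lines.Length);
    }
}
```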


4 Answers

℡Ms空城旧梦 2024-09-18 11:51:53


My company has been doing something similar with our codebase (C rather than C#), which totals about a million lines. The steps have gone something like this:

1) Write some automated tests like you describe that do system-level tests.
2) Implement the rule that new code shall have a unit test.
3) When an area has a few bugs filed against it, the process of fixing those bugs should include writing a basic unit test.

The point is that step 3 shouldn't require complete unit-test coverage (the easier it is, the more likely people are to actually do it). If you move a particular module from 0% test coverage to 40% coverage, then you've made great progress.

Though six months in you may only be up to 5% of the total codebase, that 5% is the code that is changing the most and where you are most likely to introduce bugs. The code I work on now is about 60% covered by integration tests and 15% (by line) covered by unit tests. That doesn't seem like a lot, but it does provide significant value, and our development effort has benefited from it.

Edit: in response to one of the other comments, the current set of integration tests we run takes about 14 hours at the moment. We're now looking at running some in parallel to speed them up.
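
If it helps, this is roughly what that parallel setup could look like with NUnit 3 (a newer framework than the NUnit 2.x-era tooling in the question); the fixture name and worker count below are placeholders:

```csharp
using NUnit.Framework;

// Assembly-level settings: let independent fixtures run on up to four worker threads.
[assembly: Parallelizable(ParallelScope.Fixtures)]
[assembly: LevelOfParallelism(4)]

// Fixtures that share mutable database state can opt out and stay serial.
[TestFixture, NonParallelizable]
public class NightlyImportIntegrationTests
{
    [Test]
    public void Placeholder()
    {
        Assert.Pass("Real tests for the shared-state scenarios go here.");
    }
}
```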

橪书 2024-09-18 11:51:53


The tests that you describe sound like more of an end-to-end or integration test than a unit test, strictly speaking. But that's not necessarily a bad thing! In your situation, it may be productive to work down from the end-to-end tests towards the unit tests, rather than up as you would on a new codebase.

  1. Write at least one simple test for each web service API, to ensure that you have some coverage of everything.
  2. Identify the APIs that have historically been prone to failure, based on past bug reports. Write more extensive tests for those APIs.
  3. Every time that you encounter a failing test, drop down a level and write tests for the methods called on that code path. Eventually you'll find that one of those tests is failing. If the bug isn't obvious at that level, drop down a level and repeat.
  4. Rinse, wash, repeat every time you get a new bug report.

The idea here is that you introduce unit-test coverage "as needed" along the code paths that are currently prone to failure. This will harden those code paths for the future, and will gradually expand your unit test coverage over the entire application.
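
As a sketch of step 3, assuming NUnit and some made-up class names (OrderCalculator and LineItem stand in for whatever classes actually sit on the failing code path):

```csharp
using NUnit.Framework;

// A service-level test on GetOrder started failing, so we drop a level and pin
// down the calculation it depends on. "OrderCalculator" and "LineItem" are
// hypothetical names used for illustration.
[TestFixture]
public class OrderCalculatorTests
{
    [Test]
    public void CalculateTotal_SumsPriceTimesQuantity()
    {
        var calculator = new OrderCalculator();

        decimal total = calculator.CalculateTotal(new[]
        {
            new LineItem(29.95m, 2),   // unit price, quantity
            new LineItem(10.00m, 1)
        });

        Assert.AreEqual(69.90m, total);
    }
}
```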

别挽留 2024-09-18 11:51:53


As JSBangs pointed out, those are called integration tests. And I agree that integration tests are a better fit than unit tests for your case. With 1.4 million lines of code, I am not exactly sure what you could unit test out of that code base.

For isolating the data stores, I like to do the following:

  1. Have a set of hard-coded data to start off with initially.

  2. After a while, you should have tests that actually create a bunch of test data. Run those tests first and you no longer need the hard-coded data.

  3. When something goes wrong in production due to a set of "bad data", add those cases to your tests. Eventually, you will have a good set of test data that covers most of the cases.

Also, keep in mind that integration tests take longer to run. You might want to run them on a testing machine so they don't block your computer. It's pretty normal for a test suite like this to take hours to run.
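
As a rough sketch of points 1-3, assuming NUnit and plain ADO.NET; the connection string, table, service proxy and values are placeholders, not your real schema or API:

```csharp
using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class CustomerServiceDataTests
{
    // Placeholder connection string for a dedicated test database.
    private const string ConnectionString =
        "Server=test-db;Database=AppIntegrationTests;Integrated Security=true";

    [SetUp]
    public void SeedKnownData()
    {
        // Start every test from the same hard-coded row (point 1).
        Execute("DELETE FROM Customers WHERE Id = 9001");
        Execute("INSERT INTO Customers (Id, Name) VALUES (9001, 'Integration Test Customer')");
    }

    [TearDown]
    public void RemoveTestData()
    {
        // Leave the database the way we found it.
        Execute("DELETE FROM Customers WHERE Id = 9001");
    }

    [Test]
    public void GetCustomer_ReturnsSeededRow()
    {
        // Hypothetical wsdl.exe-generated proxy for the ASMX service under test.
        var client = new CustomerService();
        client.Url = "http://test-server/CustomerService.asmx";

        var customer = client.GetCustomer(9001);

        Assert.AreEqual("Integration Test Customer", customer.Name);
    }

    private static void Execute(string sql)
    {
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

Production "bad data" cases (point 3) can then be added over time as further INSERT statements in the setup.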

独行侠 2024-09-18 11:51:53

One thing you could look at is writing user stories with something like SpecFlow. I've found that these stories map more naturally to integration tests, which is what you want. The added benefit of using these is that it creates something close to a set of use cases that can be used by non-technical teams (i.e. product managers/business analysts).
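
For example, a minimal sketch of what the SpecFlow step bindings could look like; the scenario text, service proxy and seeding helper are assumptions for illustration:

```csharp
using NUnit.Framework;
using TechTalk.SpecFlow;

// Step bindings for a hypothetical scenario such as:
//   Given a customer with id 9001 exists
//   When I request customer 9001 from the customer service
//   Then the customer name should be "Integration Test Customer"
[Binding]
public class CustomerLookupSteps
{
    private CustomerService _client;   // hypothetical ASMX proxy
    private Customer _result;

    [Given(@"a customer with id (\d+) exists")]
    public void GivenACustomerExists(int id)
    {
        TestData.EnsureCustomer(id);   // hypothetical data-seeding helper
    }

    [When(@"I request customer (\d+) from the customer service")]
    public void WhenIRequestCustomer(int id)
    {
        _client = new CustomerService();
        _result = _client.GetCustomer(id);
    }

    [Then(@"the customer name should be ""(.*)""")]
    public void ThenTheCustomerNameShouldBe(string expected)
    {
        Assert.AreEqual(expected, _result.Name);
    }
}
```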
