What is software development at your company really like (methods, tools, ...)?

Posted 2024-07-07 23:22:08

Since I started my first job as a professional software developer about two years ago, I've read many articles about commonly accepted methodologies (e.g. Scrum, XP), technologies (e.g. EJB, Spring), techniques (e.g. TDD, code reviews), tools (bug tracking, wikis) and so on in software companies.

For many of these I've found that we at our company don't use them, and I ask myself why. Are we doing it wrong, or is it merely that the articles I've read don't really tell what it's like in the real world? Are these articles more academic?

So, please tell me what it's like at your company. Tell me everything about software development there. Here are some suggestions (in the order they come to mind). At least say whether you do each one or not, or give a short comment:

  • Test-Driven-Development
  • Domain-Driven-Design
  • Model-Driven-Design/Architecture
  • Do you test?
  • Unit Testing
  • Integration Testing
  • Acceptance Testing
  • Code Reviews
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...)
  • Agile
  • Pair Programming
  • UML
  • Domain-specific languages
  • Requirement Specification (How?)
  • Continuous Integration
  • Code-Coverage Tools
  • Anemic Domain Model
  • Communication (Wiki, Mail, IM, Mailinglists, other documents)
  • Effort estimates
  • Team size
  • Meetings
  • Code metrics
  • Static code analysis
  • Bug tracking
  • ...

And remember: I want to know what you really do, not what you would like to do or think you should do.

Comments (23)

夜声 2024-07-14 23:22:08

  • Test-Driven-Development - No way.
  • Domain-Driven-Design - What's design?
  • Model-Driven-Design/Architecture - What's design? We do have an architecture team. With one exception (the most junior architect), they couldn't code their way out of a paper bag. They're sure good at drawing boxes with lines, though! And establishing crappy, worthless, over-generic and completely useless standards. (The old OPC stuff is OK, but the UA standard has been "done next month" for the last 4 years or so.)
  • Do you test? - Yep, we do have a dedicated test team. There's about 1 tester for every 10-12 devs. They're completely swamped. Ask me if we test well.
  • Unit Testing - Completely informal/up to the developer. I do when the schedule I'm given allows for it.
  • Integration Testing - Yes. This one's necessary given the suite of products we develop and support.
  • Acceptance Testing - Yes, for contract-y work only.
  • Code Reviews - Always pay lip service, never ever do it.
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...) - Taking new dependencies is strongly frowned upon. Boost will never be adopted, e.g. We have generally had good luck getting to newer versions of .Net, though, if typically 2 years or so behind the curve.
  • Agile - No. Management claims to want "agile," though they don't exhibit the barest understanding of what it is. We just recently modified our process so that higher priority tasks are spec'd and implemented with... (wait for it) higher priority! Management tells me that this is our new "agile" process. It still smells, walks, and quacks like a waterfall though.
  • Pair Programming - No way! Pay two people to do the work of one? Next you'll be suggesting that developers should waste time on nonsense like designs and code reviews. Dogs, cats, living together!
  • UML - No. We got a UML tool once to help us understand a legacy codebase that had evolved. The person in charge of evaluating the tool loved it, it reverse engineered the entire million+ line C++ codebase in less than 30 seconds! After they were talked into buying it and actual devs tried to use it, we found that it really just took those 30 seconds to fail to parse 95+% of the codebase. The error reporting was so bad the evaluator hadn't even figured out that it failed. (I'm lookin' at you, kid!) It only took us a year and a half to get around to dropping our licenses for that. See? Agile!
  • Domain-specific languages - They're probably used somewhere, though not by myself.
  • Requirement Specification (How?) - A product manager performs some voodoo and invents them. Sometimes they may even talk with customers about them! If you're really lucky, they'll even understand the difference between a use case and a requirement. Don't count on it, though. We don't really do use cases.
  • Continous Integration - No way. It's far more exciting when everything breaks at once.
  • Code-Coverage Tools - Someone once put a blankey on the source repository server in the cold, cold server room. Does that count?
  • Aenemic Domain Model - In all seriousness, I've never even heard of this before.
  • Communication (Wiki, Mail, IM, Mailinglists, other documents) - Memos. Lotus Notes doesn't do "e-mail". Bunch of newfangled rubbish.
  • Effort estimates - Not really. In my organization, estimates are code for targets. The due date for a project is locked in during the first of the project's 5 "agile" phases of waterfall development. Those due dates are called "ballpark estimates" but really mean "ship dates."
  • Team size - Runs the gamut, based on product. We have teams as small as four and as big as fifteen if you include managers.
  • Meetings - Not bad if you're relatively junior and aren't working on more than one or two products. I'm only required to attend 2-3 1-hour meetings per week.
  • Code metrics - No.
  • Static code analysis - Theoretically yes for .Net, b/c FxCop is built in and its use is mandated by our standard, but really, no. Nobody checks it b/c there are never any code reviews. Just the occasional quality audit (aka paper-trail/blame audit) to make sure we don't lose whatever this year's certification is.
  • Bug tracking - Yes, but only for customer-reported problems. Developers are not allowed to submit discovered bugs against a product they're working on b/c that's not being a "team player." (My boss' boss explained this to me in great detail when I made that mistake. I'm now friendly with a particular customer who's willing to "discover" bugs that I might "accidentally" mention in the course of other support-related communication.)

As far as big, corporate dev't goes, there's a lot worse out there. Given where I live, and the lack of high-tech jobs in the area, I'm actually pretty lucky to have a gig at all. Doesn't mean I have to like the way things are, though. It just takes a lot of time and constant pressure to even try to influence an established corporate culture.

But if they get sick of my attempts to change the culture and fire me, well, I don't think I'd cry myself to sleep that night.

南风起 2024-07-14 23:22:08

I think the famous Big Ball of Mud pattern describes a lot of work environments and gives you some good ideas about how to combat this kind of thing.


By the way, I realize I'm not directly answering your question, but the Big Ball of Mud prevails in a depressingly large percentage of development situations. You can ask about test-driven development and defect tracking and other things of that sort, but truth be told, from what I've seen I'd say the Big Ball of Mud is pretty much the de facto way that people work, whether they should or should not.

梨涡少年 2024-07-14 23:22:08

  • Test-Driven-Development - Almost there.
  • Domain-Driven-Design - No
  • Model-Driven-Design/Architecture - No
  • Do you test? - Yes
  • Unit Testing - Yes
  • Integration Testing - Yes
  • Acceptance Testing - No
  • Code Reviews - No
  • Innovative/New Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...) - ASP.NET MVC? NHibernate? Yes
  • Agile - Just started
  • Pair Programming - No
  • UML - Nothing formal
  • Domain-specific languages - No
  • Requirement Specification (How?) - Yes. Capturing story requirements.
  • Continuous Integration - Yes. TeamCity
  • Code-Coverage Tools - Yes. NCover
  • Anemic Domain Model - No
  • Communication (Wiki, Mail, IM, Mailinglists, other documents) - IM, Email
  • Effort estimates - 1,2,4,8
  • Team size - 4
  • Meetings - Daily stand up
  • Code metrics - No
  • Static code analysis - No
  • Bug tracking - Existing custom job

My department is a work in progress. Over the past few months, I've made an effort to push continuous improvement. Some of it has been downright difficult to talk about. Looking back, though, things have improved.

北凤男飞 2024-07-14 23:22:08

  • Test-Driven-Development: yes
  • Domain-Driven-Design: yes
  • Model-Driven-Design/Architecture: yes
  • Do you test? yes
  • Unit Testing - yes
  • Integration Testing - yes
  • Acceptance Testing - yes
  • Code Reviews - yes
  • Innovative Technologies - yes
  • Agile - solely agile
  • Pair Programming - yes
  • UML - sometimes for ddd whiteboard fights
  • Domain-specific languages - yes
  • Requirement Specification (How?) - in the form of examples and acceptance tests
  • Continuous Integration - yes - TeamCity
  • Code-Coverage Tools - yes - NCover
  • Anemic Domain Model - not sure
  • Communication (Wiki, Mail, IM, Mailinglists, other documents) - wiki, mail, msn
  • Team size - 6+ dependent on project
  • Meetings - every morning at 9:30 - SCRUM
  • Code metrics - dunno
  • Static code analysis - dunno
  • Bug tracking - Mantis

And most importantly...

  • Everyone goes home at 5:30: YES

However, the salary is a lot lower because a lot of people want to work for this company. Can't have everything!

壹場煙雨 2024-07-14 23:22:08
  • Test-Driven-Development: Nope. At best, in very tiny portions. We're all talking about, but don't do it.
  • Domain-Driven-Design: Nope. Hard to know what a "domain" is if you're developing a technical framework. Have not much experience in DDD to know how to do it.
  • Model-Driven-Design/Architecture: Nope.
  • Do you test?: Yes, but not enough. With every release (we're trying to push out minor releases every 8 weeks) there are always more than 2 service releases. We're in the first year of product development, so I think this is pretty okay.
  • Unit Testing: Yes. At approx. 30% coverage. Most of the developers now know that they should write unit tests for themselves. Every time they have to fix a critical bug in their own code, they see the benefit they would have had from writing a simple test up front to prevent the bug in the first place.
  • Integration Testing: Yes, using Selenium and WebDriver.
  • Performance Testing: Yes, beginning with now. Goal is to archive long-term performance measurements and compare them against releases. Using JMeter and a dedicated performance test server for that.
  • Acceptance Testing: Not really, but our product is used internally too and we're getting feedback pretty fast before it's being released. I count that as acceptance testing.
  • Code Reviews: Nope. Sometimes, someone else looks at it for 30 minutes, but that's it.
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...): From my POV, those technologies are not "innovative" any more. They're pretty much old-school now. Except JSF, which died a couple of years ago. Did Spring+Hibernate for the last couple of years. Now doing Wicket + WS for 2 years. Replaced Hibernate with iBatis SqlMaps.
  • Agile: Nope.
  • Pair Programming: Nope.
  • UML: A little bit, mainly for deployment diagrams. Class diagrams are too fine-grained and often out of sync with reality. Developers do what they think is best, not what a UML diagram tells them to do.
  • Domain-specific languages: Yes, we're using a home-brewed business rules technology. It's a visual DSL suitable for end users. Kind of like using Visio to model decisions.
  • Requirement Specification (How?): Nope.
  • Continuous Integration: Yes. Based on Hudson and Maven. Unit tests are run on each build. Additional nightly builds with more detailed reporting enabled. Whole team is notified about failed builds (yeah, they complain about getting too many mails sometimes if something breaks the chain and all 30 submodules get build failures, e.g. when the Maven Repository is unreachable)
  • Code-Coverage Tools: Yes. Maven/Cobertura and Sonar.
  • Anemic Domain Model: No idea what this is supposed to be.
  • Communication (Wiki, Mail, IM, Mailinglists, other documents): Wiki, Mail, IM, daily standup meetings, end-user and developer manuals written by a professional/non-developer.
  • Effort estimates: Trying hard to do good estimates. But without reqs, they are just rough estimations. Good enough for resource planning, though.
  • Team size: 12
  • Meetings: Daily standup, retro every 8 weeks after each minor release.
  • Code metrics: Sonar. Looking to comply with most of the rules. Did not have time to reconfigure the rules to suit our needs though.
  • Static code analysis: Sonar.
  • Bug tracking: JIRA

Notes:

  • Sonar is a code quality server. It combines tools like PMD, Findbugs, Checkstyle etc.
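The unit-testing point in this answer (a simple test written up front would have prevented the critical bug) can be sketched minimally. This is an illustrative test-first example in Python rather than the answer's actual Java/Maven stack, and `parse_port` is a hypothetical function:

```python
def parse_port(value):
    """Parse a TCP port from a string, rejecting out-of-range values.

    Hypothetical example: the kind of small utility where an unchecked
    boundary (port 0 or > 65535) turns into a critical bug later.
    """
    port = int(value)
    if not 1 <= port <= 65535:
        raise ValueError("port out of range: %d" % port)
    return port

# The "simple test up front": written before the range check existed,
# it forces the fix in rather than patching it in after a bug report.
def test_parse_port():
    assert parse_port("8080") == 8080
    try:
        parse_port("70000")
    except ValueError:
        pass  # expected: out-of-range ports must be rejected
    else:
        raise AssertionError("out-of-range port was accepted")

test_parse_port()
```

Even at the roughly 30% coverage the answer mentions, a test this small pays for itself the first time the boundary bug would otherwise have shipped.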
剩一世无双 2024-07-14 23:22:08
  • Test-Driven-Development - No
  • Domain-Driven-Design - No
  • Model-Driven-Design/Architecture - No
  • Do you test? - Almost never
  • Unit Testing - Almost never
  • Integration Testing - No
  • Acceptance Testing - No
  • Code Reviews - No
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...) - Spring, Hibernate, Wicket
  • Agile - No
  • Pair Programming - No
  • UML - just sketches
  • Domain-specific languages - No
  • Requirement Specification (How?) - We get a huge customer requirement specification and we use mind maps to extract the actual features which are then estimated
  • Continuous Integration - No
  • Code-Coverage Tools - No
  • Anemic Domain Model - Yes
  • Communication (Wiki, Mail, IM, Mailinglists, other documents) - Mind maps, Mail
  • Effort estimates - FITA (finger in the air)
  • Team size - 2-6
  • Meetings - 2-3 times a week
  • Code metrics - No
  • Static code analysis - No (Tried FindBugs and Checkstyle)
  • Bug tracking - Yes, Bugzilla
不醒的梦 2024-07-14 23:22:08

I feel sorry for you :) It's not a good environment to work in, as you need to constantly practise good practices to really understand and use them.

I know several (mine included) companies which would be able to tick all the 'good' boxes in your list.

However, the devil is in the details, and even in some companies with good SDP policies, not every project follows them.

固执像三岁 2024-07-14 23:22:08
  • Test-Driven-Development - if by this you mean writing tests before code, not always
  • Domain-Driven-Design - not pure DDD
  • Model-Driven-Design/Architecture - never, but really NEVER, again
  • Do you test? - yes, always
  • Unit Testing - yes, always
  • Integration Testing - it depends but we try to avoid them as they are typically slow
  • Acceptance Testing - yes, ideally automated
  • Code Reviews - yes, this is included in the definition of done
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...) - not sure the mentioned technologies are innovative but yes to Spring, Hibernate, WS
  • Agile - yes, this is in my DNA
  • Pair Programming - not always but yes (on new subjects, with new team members, if explicitly asked)
  • UML - a little (i.e. a class or a sequence diagram on a whiteboard from time to time), only if helpful
  • Domain-specific languages - no real usage until now
  • Requirement Specification (How?) - lightweight specifications (e.g. user stories)
  • Continuous Integration - of course (and according to our definition of done, the code must have been "integrated")
  • Code-Coverage Tools - yes (cobertura), this is in the definition of done
  • Anemic Domain Model - no, we try to avoid that
  • Communication (Wiki, Mail, IM, Mailinglists, other documents) - face to face, wiki, IM, mail, mailing list (but we try to avoid word documents)
  • Effort estimates - yes, in story points at the backlog level, in hours at the iteration level
  • Team size - 7+/-2
  • Meetings - yes, but only useful ones, and always time-boxed (iteration planning, daily meeting, demo and retrospective)
  • Code metrics - yes (cyclomatic complexity, code coverage, coding standards collected in sonar)
  • Static code analysis - yes (findbugs, checkstyle)
  • Bug tracking - yes, of course (but we try to fix bugs as soon as we discover them)
灰色世界里的红玫瑰 2024-07-14 23:22:08

  • Test-Driven-Development - No. An attempt was made to bring this in, but I don't think it has taken off, so this is still a no, just with more details now.
  • Domain-Driven-Design - No
  • Model-Driven-Design/Architecture - No
  • Do you test? - Yes, but not comprehensively. We do have some unit tests, some integration tests and some WatiN tests.
  • Unit Testing - We have some for our new development but the legacy ones don't.
  • Integration Testing - Usually, when it is applicable. My team being the web team doesn't seem to have this too often yet.
  • Acceptance Testing - We have a few levels of this. The first is when a new feature is being developed and has to get an initial approval from someone on another team that will be entering the content that comes before it is even integrated in with the code. The second is when the features get demonstrated at the end of a Sprint to get more feedback about what isn't looking right or working well. Then there is a third level just before it goes into production as a final, "Yes, this doesn't mess up what we have already," sort of thing.
  • Code Reviews - These aren't done anymore but would probably be a good thing to do.
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...) - There are some RESTful ideas being applied in our project and we are using some features of the .Net framework like lambda expressions.
  • Agile - We use Scrum and have stand-ups, a story board and an Iteration Planning Meeting. (That meeting is really for the sprint, not the iteration; an iteration is 2 sprints, since after each pair of sprints the work is shown to executives and other departments, while the other demo is for an architect and the head of the content-entering team.)
  • Pair Programming - We do have pairing on most new development that isn't seen as grunt work. So for whoever wants to work on the Training part of the site, a pair will do it instead of just one developer.
  • UML - No, and the tool for UML was removed in our new machines
  • Domain-specific languages - No, but there is some terminology that is the company's own interpretations of things as some names of internal products bump against terms others may use for various things.
  • Requirement Specification (How?) - This can range from a big word document spelling out what needs to be done to conversations of what to do and then try this and try that afterward.
  • Continuous Integration - We have Cruise Control.Net running for our CI, used when we commit code changes.
  • Code-Coverage Tools - Nope.
  • Anemic Domain Model - Somewhat, in that there isn't really a big domain model here.
  • Communication (Wiki, Mail, IM, Mailinglists, other documents) - In order of importance: E-mail, IM, phone, then visiting cubicle. There is a weekly team meeting with the manager of applications and daily standups on the big project.
  • Effort estimates - This is now common in each sprint though sometimes this is done by sending out a spread sheet for everyone to put in their estimates that the Scrum Master combines all the results for us to see in the end.
  • Team size - 5 developers with a team lead, a business analyst who is the Scrum Master, a tester to oversee what we have and others outside the team that pop up as needed including content authors to actually use the system.
  • Meetings - Down to business, short, effective and typically good for communicating where things are currently.
  • Code metrics - None that I know.
  • Static code analysis - Nope.
  • Bug tracking - Quality Center is used for tracking defects.
  • Source Control - We are using Subversion now. For any feature or bug we create a new branch so we can work independently and not have our commits break the build as we are working on something. However, we all share the same DB for development, which can be interesting at times.
  • IDE - Visual Studio 2008 on XP using .Net 3.5 and Sitecore 6.1
  • ...

The team is on our 3rd team lead in the almost 2 years I've been here.

The CMS project is the big project that we are all working on though there are various support requests that come in that others handle.

There have been a lot of changes in the year that we've had a VP of IS. Production is more locked down and there is more work to get a release done as there is a check list procedure now and more hoops that may be useful.

薯片软お妹 2024-07-14 23:22:08

I am one of two programmers at a small software firm with VERY non-technical owners ("what's a 'browser'" - seriously asked last week).

The advantage is that I can choose for myself on most of these points.

Test-Driven-Development - Our owner had a bad experience or something. Mention testing in the wrong way and I'd swear she's acting out of PTSD.

Domain-Driven-Design - Yep.

Model-Driven-Design/Architecture - Yep.

Do you test? - Nope. Testing falls on the sales & support staff and the owners. So nothing gets tested much once it leaves my dev machine.

Unit Testing - If I got caught doing it I might get fired. And it's seriously the only thing that could get me fired.

Integration Testing - See "Do you test?"

Acceptance Testing - Well, we have to deploy it sometime, right?

Code Reviews - No one else would understand it. I've seen everyone else's. Wish I hadn't.

Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...) - Yep. But I get flak for it. I'm the "kid" who doesn't know anything that wasn't invented in the last ten years (despite being 30 and having "Readings in Database Systems" on my desk).

Agile - I don't waterfall. But I don't really Agile, either.

Pair Programming - You don't want to try to talk to the other "programmer" that works at our company.

UML - Nope. But I draw boxes with identifiers in them sometimes on my whiteboard. The bosses like that. Good thing they're not more technically inclined, or I'd probably see more boxes.

Domain-specific languages - Nope.

Requirement Specification (How?) - Nope.

Continuous Integration - Nope.

Code-Coverage Tools - Nope.

Anemic Domain Model - Yep. Tried it. I don't like it for most of my situations, and don't use it.

Communication (Wiki, Mail, IM, Mailinglists, other documents) - Tried, couldn't get coworker buy-in. Our MediaWiki install still has the default logo graphic.

Effort estimates - We have to estimate how long every job will take in hours. That is what the client gets billed. And that is what we are expected to spend on the project. We even do this when we are looking at new clients and developing new apps from scratch, as well as when we do bug fixes (yeah, we charge clients for that), feature additions, etc.

Team size - 1. And let me say this is not a good way to work. It is much better to be able to bounce ideas off other capable programmers in real time.

Meetings - a few hours' worth a week, sometimes double that, rarely less. Half the meetings I do are with clients; half are totally internal.

Code metrics - Nope.

Static code analysis - Nope.

Bug tracking - Not as much as I should.

夢归不見 2024-07-14 23:22:08

I work for a Ruby on Rails consultancy in Brisbane, Australia.

  • Test-Driven-Development: This is pushed very, very hard in our office. Not testing is viewed as "incredibly stupid". You write the code; how do you ensure, by way of an automated process such as CI, that it still works? Tests.

  • Do you test?: See point one.

  • Unit Testing: All the time, using RSpec. I'm "fluent" in Test::Unit and Shoulda also.

  • Integration Testing: Again, all the time, Cucumber.

  • Acceptance Testing: With our tickets we "deliver" them with an Acceptance Criteria. Then the client has to either "Accept" or "Reject" them by following the bouncing ball. The Acceptance Criteria has the bonus of also being what our Cucumber features are written in.

  • Code Reviews && Pair Programming: We pair. It's the instant version of code review. It's awesome because you can sit back and watch someone else work: they write the test and then you write the code to make that test pass. If you're sick then the other person knows what you were up to and can pair with somebody else.

  • Innovative Technologies: Because we use Rails, we're really fond of REST.

  • Agile: We work on 2 week iterations using Pivotal Tracker.

  • Requirement Specification (How?): Using features in Pivotal Tracker the client can specify what requirements they have and then we flesh them out (usually by talking with them) into acceptance criteria, and eventually Real World features.

  • Continuous Integration: We use a CI server I developed called construct. This was built with the idea of being like Integrity, but with background jobs and better support for branches. Now that Integrity has background building, it's the branching support that keeps us "ahead".

  • Code-Coverage Tools: RCov sometimes. We're not really fussed with code coverage as we test EVERYTHING before we write it. If it exists, there's something that will test it.

  • Communication (Wiki, Mail, IM, Mailinglists, other documents): We communicate with our clients using Email primarily, but we also have "standups" with them using Skype. We've also used Basecamp for this. I'm wanting to use Google Wave for our next project, just as a little experiment. I think it'd be really helpful.

  • Team size: We have 4 people in our team, usually 2 pairs unless someone's sick.

  • Meetings: We have a "scrum"/"standup" in the mornings lasting about 15 minutes. The idea of this is that you go over what you did the previous day, any problems you encountered, what you're going to do today and something "new and shiny" you found. This should not turn into a project meeting. Those are for after the standup if required.
  • Bug tracking: Again we use Pivotal Tracker. The client can file a bug here and then (ideally) write how to duplicate it. Then we write a test to ensure that this should never happen again (aka: Regression Testing) and it goes through the same work process as our features, we deliver and the client accepts.
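The regression loop this answer describes — the client files a bug, you write a failing test that reproduces it, then fix the code — can be sketched in a language-neutral way. A minimal sketch using Python's stdlib unittest rather than RSpec/Cucumber; `valid_email` is a made-up function standing in for the fixed code:

```python
import unittest

# Hypothetical function under test: suppose the bug report said a
# whitespace-only email address crashed signup instead of being rejected.
def valid_email(address: str) -> bool:
    # The fix: reject blank/whitespace-only input before checking for '@'.
    return bool(address.strip()) and "@" in address

class RegressionTests(unittest.TestCase):
    """One test per fixed bug, so the bug cannot silently return."""

    def test_whitespace_only_email_rejected(self):
        # Reproduces the ticket's steps; fails against the pre-fix code.
        self.assertFalse(valid_email("   "))

    def test_normal_email_still_accepted(self):
        self.assertTrue(valid_email("dev@example.com"))
```

Running the suite on every build (e.g. `python -m unittest` in CI) is what turns a one-off bug fix into a permanent regression test, mirroring the deliver/accept workflow above.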
杀手六號 2024-07-14 23:22:08

I Work for Chillisoft.co.za in South Africa

Test-Driven-Development: We have been using Test Driven Development practices since the first Kent Beck book; it is enforced throughout. We use NUnit and R# as the test runner.

Do you test?: In addition to TDD we do manual testing (Visual) and this is automated where necessary. Technologies used for automation depends on UI Technologies.

Unit Testing: See TDD.

Integration Testing: Yes, but it is not yet used ubiquitously.

Acceptance Testing: We do custom software development for external customers; you don't get paid until they accept, hence yes.

Code Reviews: Scheduled bimonthly for every project. Even those that have been pair/peer programmed.

Pair Programming: We do most of our coding in pairs, but there are certainly some tasks and some stages of the project where this is less efficient. What we do is pair program during project startup (the first few weeks of each phase). In the finishing stages we do not. We also have specific times (8 hours per week per developer) when we work on open source projects; these are all pair programmed. All our machines are set up with multiple keyboards and mice to facilitate smooth interaction between devs.

Innovative Technologies: We have done a large amount of work on Habanero and use this framework along with a DI container Unity and RhinoMocks.

Agile: We have been using agile philosophies for 8 years and are continuing to experiment with tools and Philosophies as we continue down this path.

Requirement Specification (How?): We capture user stories (use cases) for the next iterations in MS Word. We then capture the summary of these in Jira with effort estimates etc., which manages burn-down charts etc.

Continuous Integration: We are currently using Hudson, which works on top of SVN.

Code-Coverage Tools: We run code coverage for every project as part of our nightly build. We have integrated the resulting report into the Hudson reports so that we can track these daily for every project.

Communication (Wiki, Mail, IM, Mailinglists, other documents): Obviously we communicate in many different manners; we have an internal wiki etc.

Team size: We have 15 software developers.

Meetings: We have a "scrum" every morning lasting about 10 minutes.

Bug tracking: We use different systems for internal bug tracking (i.e. during development and internal testing) and for external bug tracking i.e. bugs from customers. Internal tracking (i.e. during internal testing and dev) we use redmine. External tracking (i.e. for our customers) we use Mantis.
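The answer above leans on a DI container (Unity) plus a mocking library (RhinoMocks). The mechanism they automate is constructor injection: pass the dependency in, so a test can swap the real collaborator for a fake. A minimal sketch in Python rather than C#; all names here are illustrative, not Chillisoft's actual code:

```python
class SmtpGateway:
    """Production dependency; would talk to a real mail server."""
    def send(self, to: str, body: str) -> None:
        raise NotImplementedError("network I/O omitted in this sketch")

class FakeGateway:
    """Hand-rolled test double, the kind a tool like RhinoMocks generates."""
    def __init__(self):
        self.sent = []
    def send(self, to: str, body: str) -> None:
        self.sent.append((to, body))

class InvoiceNotifier:
    def __init__(self, gateway):
        # The dependency is injected, not constructed here; a container
        # such as Unity would resolve the real SmtpGateway in production.
        self._gateway = gateway
    def notify(self, customer: str, amount: float) -> None:
        self._gateway.send(customer, f"Amount due: {amount:.2f}")

# In a unit test, inject the fake and assert on what was "sent".
fake = FakeGateway()
InvoiceNotifier(fake).notify("client@example.com", 120.5)
```

The design choice is that `InvoiceNotifier` never names its concrete collaborator, which is what makes it testable without a mail server.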

心意如水 2024-07-14 23:22:08

•Test-Driven-Development - Just starting to take over, very happy with it so far

•Do you test? - Of course, everyone does, who wants QA laughing at them?

•Unit Testing - For about 2 years now, has helped with stability and tests are run every build

•Code Reviews - Here and There, especially with any late changes

•Agile - Love Agile and its responsiveness

•Pair Programming - Just trying it out in a few spots, early returns promising

•Continuous Integration - CruiseControl.NET for the Win!!! Such a huge help

•Code-Coverage Tools - Always during every unit test run, CC.NET publishes this info out to the world

•Communication (Wiki, Mail, IM, Mailinglists, other documents) - WIKI, IM, Campfire

•Team size - small; when a product team gets too big we break it down into feature teams

•Meetings - short and not often, more likely to be hallway get-togethers

•Code metrics - Only cyclomatic complexity

•Static code analysis - Really trying to do this more; we use FxCop and VSTS's homegrown tools

•Bug tracking - TFS for windows and Traq for Mac
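The answer above tracks "only cyclomatic complexity" as its code metric. McCabe's metric is essentially 1 + the number of decision points, which is simple enough to sketch with Python's stdlib `ast` module (a deliberate simplification — production tools count a few more constructs than this):

```python
import ast

# Node types treated as decision points in this simplified count.
_DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """McCabe complexity, simplified: 1 + number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, _DECISIONS) for node in ast.walk(tree))

sample = """
def f(x):
    if x > 0:
        return 1
    for i in range(x):
        pass
    return 0
"""
# One `if` plus one `for` gives a complexity of 3 for `sample`.
```

A team that tracks nothing else still gets a useful signal from this one number: functions whose score creeps up are the ones accumulating branches and test cases.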

蓝色星空 2024-07-14 23:22:08

Testing: I do a lot of system testing, and a far smaller amount of unit testing. I try to use test driven development when it makes sense, but it feels like most of the time it doesn't make sense for the core of what I'm doing.

As for the rest, I'm not sure if I properly do "domain-specific languages" or not, but I do use a lot of automatically generated code to capture the repetitive stuff in my work -- I count 9 Perl scripts generating nearly 100,000 lines of code.

As for the rest, team size is always one. I use PC-Lint for static code analysis about once a year. I use gprof and valgrind quite heavily (you don't seem to have mentioned this class of tools). I've been pining for a proper bug tracker system for years now, but am still using to-do list software and e-mail to handle it at the moment.

风吹短裙飘 2024-07-14 23:22:08
  • Test-Driven-Development: very occasionally someone might do it for a component. Also, implementing a public specification which comes with conformance tests offers some of the advantages of TDD, and lots of that goes on.
  • Domain-Driven-Design: no
  • Model-Driven-Design/Architecture: no
  • Do you test?: yes
  • Unit Testing: some, although not complete. A lot of components are libraries for customer use. There's a fine line between unit and functional testing of a "strlen" implementation.
  • Integration Testing: not really, there's little between unit and system tests
  • Acceptance Testing: yes, and subsets of the acceptance tests used as system tests
  • Code Reviews: no formal process, but some code gets reviewed
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...): no
  • Agile: no
  • Pair Programming: no
  • UML: no
  • Domain-specific languages: very occasionally
  • Requirement Specification (How?): sort of
  • Continuous Integration: no, but daily builds and reversion of failure-causing changes at the discretion of the test team
  • Code-Coverage Tools: no formal requirement, test team have been known to use 'em
  • Anemic Domain Model: I don't even know what this is
  • Communication (Wiki, Mail, IM, Mailinglists, other documents): all of them, chosen ad hoc except that requirement and design docs must be HTML under source control, and internal interface documentation is generated from Doxygen comments in headers.
  • Effort estimates: a bit
  • Team size: about 20 programmers, variously grouped into component teams of 1-4 people. Pretty much nobody works exclusively on the component whose team they belong to.
  • Meetings: weekly full meeting to exchange progress reports and otherwise share what's going on. No other regularly scheduled meetings for developers: discussions arranged as required.
  • Code metrics: no
  • Static code analysis: not unless you count -pedantic ;-)
  • Bug tracking: Bugzilla, somewhat integrated with source-control
胡大本事 2024-07-14 23:22:08
  • Test-Driven-Development: no.
  • Domain-Driven-Design: no
  • Model-Driven-Design/Architecture: we do start with models for our apps
  • Do you test? yes. Sometimes I'm the only person who tests my stuff. I hate this.
  • Unit Testing - no. This is a lacking area in my skill set that I consider a high priority to remedy.
  • Integration Testing - no
  • Acceptance Testing - sometimes. Tough to get the users to go through with it, even with threats from On High.
  • Code Reviews - no. We have discussed doing it but never do in the end. I'm frustrated about this.
  • Innovative Technologies - no
  • Agile - we're mildly agile, though not precisely through a pre-meditated effort
  • Pair Programming - no
  • UML - as little as we need to, but we do model (we're more deliberately agile here).
  • Domain-specific languages - no
  • Requirement Specification (How?) - we do. My group is sometimes primarily responsible for requirements gathering. We are typically assisted by a Biz Analyst now, but that wasn't always so. The Veep sometimes hands us requirements that come from I don't know where. Sometimes they're old things that were never done but had been planned ages ago. Typically, gathered requirements are placed into a Word document reviewed by the primary users as well as my team, the Biz Analyst, and the Veep.
  • Continuous Integration - nope
  • Code-Coverage Tools - no
  • Anemic Domain Model - I'm not familiar with this, but no.
  • Communication (Wiki, Mail, IM, Mailinglists, other documents) - just email and face to face. I broached this subject recently because we need to do something more, preferably a wiki. It was placed on the back burner.
  • Effort estimates - yes, but we don't make any attempt to really track them. This is another area where I am lacking.
  • Team size - 3, myself included ( Director <- Team Leader <- Me).
  • Meetings - we meet once a week, though not always. Boss usually checks in at least a few times a week individually. Larger group meetings take place sporadically. And of course we schedule meetings to hash out project requirements as necessary.
  • Code metrics - nope
  • Static code analysis - nope
  • Bug tracking - we log errors. that's pretty much our bug tracking.

That's it. We have areas I feel like we could improve on.

Update:

We lost our business analyst to a large layoff a couple of weeks after I posted this (early November 08). I've since implemented ELMAH in an existing application and a more recently developed one to assist in bug tracking (we also log to a database) and I love it for the ease of use, the features, and the ability to catch exceptions we aren't catching (which is largely unused, but still nice coverage to have). I'm still poking around with Unit Testing on my own - I really need to pick up the pace there (I also want to learn MVC but I mostly poke around with that too).

We're designing a new application right now, and I'm doing a mock DB schema (which will get some basic diagrams) for 3 of the 6 modules (the Team Leader is working on the other 3). I'm not looking forward to it, since this will be developed in tandem by the 3 of us (2 modules each) using IronSpeed Designer (6.1). There are things IronSpeed will do for me that I like, but it's not the only way to do these things quickly, and it does some things I don't care for.

Nothing else has changed.

风向决定发型 2024-07-14 23:22:08

My company has jumped on most of the "buzzword" methodologies. Unit Testing, Test Driven Development, Scrum, Agile, Continuous Integration, Code-Coverage analysis, etc.
I find we are jumping from product to product as team sizes change with the economy. We shifted from Rally Dev/Scrum to Jira/Agile after plenty of layoffs.
We are using Selenium for automated testing, but now looking at Tellenium and Google's WebDriver.

What are we finding? Sites that have passed every test created for them (including load testing) can be incredibly inefficient when truly analyzed. After a code performance analysis we were able to cut server resources by 2/3 for one of our sites, and still had better performance. It still passed the same tests too.

Front-end automated testing does not catch positioning issues that a human would notice in seconds. Sure, we could spend a few hours writing tests to check for positioning. But the tests are brittle and have to be rewritten when page layouts change, even just a little. Testing usually just indicates the code works, not how good it is.

I've worked at big and small companies using many different technologies. Including simple "cowboy coding". There were a lot more bugs when we didn't employ planning and testing methodologies, but we moved a lot quicker. We pushed out changes and fixes in hours, not days and weeks.

Facebook does a "push" every week (Tuesdays). Often enough there are bugs in the latest code push (not enough testing?), but they often do another push by that Thursday or Friday to fix any issues. My guess is Facebook is closer to the "cowboy coding" methodology and it's been working for them.

能怎样 2024-07-14 23:22:08

Here are my observations:

  • Test-Driven-Development : No

  • Domain-Driven-Design : Yes

  • Model-Driven-Design/Architecture : Yes

  • Do you test? : Yes

  • Unit Testing : Yes

  • Integration Testing : Yes

  • Acceptance Testing : Yes

  • Code Reviews : No

  • Innovative Technologies (Spring,
    Hibernate, Wicket, JSF, WS, REST,
    ...) : Yes

  • Agile Pair Programming : No

  • UML : Yes

  • Domain-specific languages : Yes

  • Requirement Specification (How?) Yes

  • Continuous Integration : Yes

  • Code-Coverage Tools : No

  • Anemic Domain Model : No (what is meant by this?)

  • Communication (Wiki, Mail, IM,
    Mailinglists, other documents) : Wiki, Mail, IM, Mailinglists

  • Effort estimates : Yes

  • Team size : 2-4 members

  • Meetings : Fixed meetings every Monday and floating meetings every other day

  • Code metrics : Yes

  • Static code analysis : No

  • Bug tracking : Yes

野の 2024-07-14 23:22:08
  • Test-Driven-Development - Yes
  • Domain-Driven-Design - No
  • Model-Driven-Design/Architecture - No
  • Do you test? - Yes
  • Unit Testing - Yes
  • Integration Testing - Yes
  • Acceptance Testing - Started
  • Code Reviews - No
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...) - No?
  • Agile - Yes
  • Pair Programming - Yes almost all of the time
  • UML - Nothing more formal than lines and boxes on whiteboards.
  • Domain-specific languages - A little
  • Requirement Specification (How?) - No, we try to get user stories if possible
  • Continuous Integration - Yes
  • Code-Coverage Tools - No
  • Anemic Domain Model -
  • Communication (Wiki, Mail, IM, Mailinglists, other documents) - Wiki, IM, Email, Word Docs
  • Effort estimates - We use a combination of T-Shirt size (S, M, L, XL etc) and a points system for sprint by sprint velocity.
  • Team size - 6-8
  • Meetings - Daily stand up
  • Code metrics - No
  • Static code analysis - No
  • Bug tracking - Bugzilla / Version One
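The points-based, sprint-by-sprint velocity mentioned above reduces to simple arithmetic: a rolling average of completed points, then ceiling division over the remaining backlog. A small sketch with made-up numbers:

```python
import math

def forecast_sprints(completed_per_sprint, backlog_points, window=3):
    """Rolling-average velocity over the last `window` sprints, then
    how many sprints the remaining backlog should take at that pace."""
    recent = completed_per_sprint[-window:]
    velocity = sum(recent) / len(recent)
    return velocity, math.ceil(backlog_points / velocity)

# Last three sprints completed 22, 20, and 24 points -> velocity 22.0,
# so a 55-point backlog forecasts to 3 more sprints.
velocity, sprints = forecast_sprints([18, 22, 20, 24], backlog_points=55)
```

T-shirt sizes (S, M, L, XL) typically get mapped onto point values first; the forecast is only as good as that mapping and the stability of the velocity.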
蓝色星空 2024-07-14 23:22:08
  • Test-Driven-Development - No
  • Domain-Driven-Design - No
  • Model-Driven-Design/Architecture - No
  • Do you test? - Sometimes
  • Unit Testing - Almost never
  • Integration Testing - Yes
  • Acceptance Testing - Sometimes
  • Code Reviews - Only occasionally
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...) - No
  • Agile - No
  • Pair Programming - No
  • UML - on my marker board, yes.
  • Domain-specific languages - C++ is domain specific, right?
  • Requirement Specification (How?) - I think we meet 'em.
  • Continous Integration - Yes
  • Code-Coverage Tools - No
  • Anemic Domain Model - What's a domain model?
  • Communication (Wiki, Mail, IM, Mailinglists, other documents) - Email & Skype. What's a wiki?
  • Effort estimates - 1-2 days for any given task
  • Team size - 2 software engineers, 10 hardware engineers
  • Meetings - 2 times a week
  • Code metrics - No
  • Static code analysis - No
  • Bug tracking - No
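Since this answer (and a couple of others in the thread) asks what an anemic domain model is: it's Fowler's term for domain objects that are bare data holders, with all the behaviour pushed into separate "service" code, as opposed to a rich model where the object enforces its own rules. A toy illustration (the account example is mine, not from any answer here):

```python
# Anemic style: the object is just data; the rules live elsewhere.
class AnemicAccount:
    def __init__(self, balance: float):
        self.balance = balance

def withdraw(account: AnemicAccount, amount: float) -> None:
    # Procedural "service" code holds the business rule.
    if amount > account.balance:
        raise ValueError("insufficient funds")
    account.balance -= amount

# Rich style: the object owns and enforces its own invariants.
class Account:
    def __init__(self, balance: float):
        self._balance = balance

    @property
    def balance(self) -> float:
        return self._balance

    def withdraw(self, amount: float) -> None:
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount
```

The criticism of the anemic style is that nothing stops other code from bypassing `withdraw` and setting `balance` directly, so the invariant lives by convention rather than by construction.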
早乙女 2024-07-14 23:22:08
  • Test-Driven-Development - Nope, on purpose.
  • Domain-Driven-Design - Nope, we're still figuring out the domain.
  • Model-Driven-Design/Architecture - Nope
  • Do you test? - We test the builds, and get power-users to test.
  • Unit Testing - Nothing formal (no nUnit etc)
  • Integration Testing - No
  • Acceptance Testing - Yes
  • Code Reviews - Occasionally.
  • Innovative Technologies - random SharePoint tools
  • Agile - Yes
  • Pair Programming - No
  • UML - Never
  • Domain-specific languages - Nope
  • Requirement Specification (How?) - We stay light on this and iterate. We have a BA that does some requirements analysis, but usually we just invite the customer to our planning and daily meetings as we go. No formal docs.
  • Continuous Integration - Yes (cruisecontrol.net)
  • Code-Coverage Tools - No (but we do use Visual Studio Code Analysis)
  • Communication - Outlook
  • Effort estimates - ballpark, double it, then double it again.
  • Team size - 2-4
  • Meetings - everyday at 9am (scrum!) plus a weekly planning/review meeting
  • Code metrics - Nope
  • Bug tracking - Bugzilla
  • Source Control - SVN
浅唱ヾ落雨殇 2024-07-14 23:22:08

Did you have a look at NDepend? The tool analyze C# and .NET code and comes with plenty of cool features to browse the analysis results. With NDepend you can write rules on code, you can compare 2 versions of the code base and you can harness more than 80 code metrics.

Also, the tool comes with several great visualization features like:

Dependency Graph, Dependency Matrix, and code metric visualization through treemapping (screenshots omitted).

瞎闹 2024-07-14 23:22:08

It's nice to hear that MDA, DDD and Pair Programming are not used anywhere :D Martin Fowler is not god, just a guy with some weird ideas.

  • Test-Driven-Development - if you want to
  • Unit Testing - yes
  • Code Reviews - kinda
  • Innovative Technologies (Spring, Hibernate, Wicket, JSF, WS, REST, ...) - yes, Seam
  • UML - kinda
  • Continuous Integration - yes and no
  • Code-Coverage Tools - yes and no