How can I speed up Sonar's package design analysis?

Posted 2024-09-14 03:12:49


I maintain the build process for a large (> 500,000 LOC) Java project. I've just added a Sonar analysis step to the end of the nightly builds. But it takes over three hours to execute ... This isn't a severe problem (it happens overnight), but I'd like to know if I can speed it up (so that I could run it manually during work hours if desired).

Any Sonar, Hudson, Maven or JDK options I can tweak that might improve the situation?

[INFO]  -------------  Analyzing Monolith
[INFO]  Selected quality profile : Sonar way, language=java
[INFO]  Configure maven plugins...
[INFO]  Sensor SquidSensor...
[INFO]  Java AST scan...
[INFO]  Java AST scan done: 103189 ms
[INFO]  Java bytecode scan...
... (snip)
[INFO]  Java bytecode scan done: 19159 ms
[INFO]  Squid extraction...
[INFO]  Package design analysis...
... (over three hour wait here)
[INFO]  Package design analysis done: 12000771 ms
[INFO]  Squid extraction done: 12277075 ms
[INFO]  Sensor SquidSensor done: 12404793 ms

12 million milliseconds = 200 minutes. That's a long time! By comparison, the compile and test steps before the Sonar step take less than 10 minutes. From what I can tell, the process is CPU-bound; a larger heap has no effect. Maybe it has to be this way because of the tangle/duplication analysis; I don't know. Of course, I know that splitting up the project is the best option! But that will take a fair amount of work; if I can tweak some configuration in the meantime, that would be nice.
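To illustrate the kind of heap tweak I mean (the values here are just examples, not a recommendation):

export MAVEN_OPTS="-Xmx4096m -XX:MaxPermSize=512m"   # give the Maven-driven Sonar run a larger heap
mvn sonar:sonar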

Any ideas?


Comments (2)

躲猫猫 2024-09-21 03:12:49


I've walked in your shoes: on a 2-million+ LOC project (which, indeed, should have been split into sub-projects years ago), I never saw the package design analysis complete within 4 days of computation...

As of SONAR-2164 (Add an option to skip the quadratic "Package design analysis" phase), I have submitted a patch that would allow users to set a skip flag to true in their Maven project file so that the package design analysis is skipped.
This patch is pending approval and is currently scheduled for inclusion in v2.7.
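A minimal sketch of what that configuration would look like in a pom.xml, assuming the sonar.skipDesign property name that SONAR-2164 tracks:

<properties>
    <!-- Skip the quadratic "Package design analysis" phase (SONAR-2164) -->
    <sonar.skipDesign>true</sonar.skipDesign>
</properties>

The same flag could also be passed on the command line for a one-off run:

mvn sonar:sonar -Dsonar.skipDesign=true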

余生一个溪 2024-09-21 03:12:49


From Freddy Mallet on the list:

"... the problem doesn't come from the DB but come from the algorithm to identify all the package dependencies to cut. ... If you manage to cut this project in several modules, then your problem will vanish."

I tested this theory by excluding a relatively large package, and sure enough the analysis time dropped dramatically. In theory the number of connections can grow quadratically with the number of packages, so this approach is probably as good as it gets with such a large codebase.
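A minimal sketch of that exclusion experiment, assuming the standard sonar.exclusions property; com/example/bigpackage is a placeholder, not the real package name:

<properties>
    <!-- Exclude one large package so it is left out of the dependency analysis -->
    <sonar.exclusions>com/example/bigpackage/**/*.java</sonar.exclusions>
</properties>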
