Out of memory while analyzing a Java heap dump

I have a curious problem: I need to analyze a Java heap dump (from an IBM JRE) which is 1.5 GB in size. The problem is that while analyzing the dump (I've tried HeapAnalyzer and the IBM Memory Analyzer 0.5) the tools run out of memory, so I can't really analyze the dump. I have 3 GB of RAM in my machine, but it seems that's not enough to analyze the 1.5 GB dump.

My question is: do you know of a specific tool for heap dump analysis (supporting IBM JRE dumps) that I could run with the amount of memory I have?

Thanks.
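For context, both tools mentioned above are themselves Java programs, so the heap available to them is whatever `-Xmx` they were launched with. A minimal sketch of raising that limit before switching tools (the jar file name `ha456.jar` and the `2500m` value are assumptions, and on a 32-bit JVM the usable maximum will be considerably lower):

```
# IBM HeapAnalyzer is a plain executable jar; give it as much heap as the OS allows.
java -Xmx2500m -jar ha456.jar

# For the Eclipse-based Memory Analyzer, the same limit is set in
# MemoryAnalyzer.ini, in the lines following -vmargs:
#   -vmargs
#   -Xmx2500m
```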

Comments (2)

满身野味 2024-09-11 18:36:59

Try the SAP Memory Analyzer tool, which also has an Eclipse plugin. This tool creates index files on disk as it processes the dump file and requires much less memory than your other options. I'm pretty sure it supports the newer IBM JREs. That being said, with a 1.5 GB dump file you might have no other option but to run a 64-bit JVM to analyze this file. I usually estimate that a heap dump file of size n takes 5*n memory to open using standard tools, and 3*n memory to open using MAT, but your mileage will vary depending on what the dump actually contains.
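As a rough sketch of how that can look in practice (MAT ships a headless parse script; the dump path, the -Xmx value, and the report name below are illustrative, and IBM-format dumps additionally need the IBM DTFJ adapter plugin installed in MAT):

```
# Parse the dump once in headless mode; MAT writes its index files next to
# the dump, so a later interactive session needs far less memory.
./ParseHeapDump.sh /dumps/heapdump.phd -vmargs -Xmx2500m

# Optionally generate the leak-suspects report directly from the command line:
./ParseHeapDump.sh /dumps/heapdump.phd org.eclipse.mat.api:suspects
```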

终弃我 2024-09-11 18:36:59

It's going to be difficult to analyze a 1.5 GB heap dump on 3 GB of RAM, because out of that 3 GB your OS, other processes, services, etc. easily occupy 0.5 GB, so you are left with only 2.5 GB. The HeapHero tool is efficient at analyzing heap dumps; it should take only about 0.5 GB more than the size of the heap dump to do the analysis. You can give it a try. But the best recommendation is to analyze the heap dump on a machine which has adequate memory, or you can get an AWS EC2 instance just for the period of analyzing the heap dumps. After analyzing them, you can terminate the instance.
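If you go the temporary-big-machine route, the workflow can be as simple as the sketch below (the host name, key file, dump name, and MAT install path are purely illustrative; the report zip name follows MAT's usual `<dump>_Leak_Suspects.zip` convention):

```
# Copy the dump to a roomier analysis host, run MAT's headless report there,
# and pull the result back; the instance can be terminated afterwards.
scp -i my-key.pem heapdump.phd ec2-user@analysis-host:/tmp/
ssh -i my-key.pem ec2-user@analysis-host \
    "cd /tmp && ~/mat/ParseHeapDump.sh heapdump.phd org.eclipse.mat.api:suspects"
scp -i my-key.pem ec2-user@analysis-host:/tmp/heapdump_Leak_Suspects.zip .
```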
