Out of memory while analyzing a Java heap dump
I have a curious problem: I need to analyze a Java heap dump (from an IBM JRE) which is 1.5GB in size. The problem is that while analyzing the dump (I've tried HeapAnalyzer and IBM Memory Analyzer 0.5) the tools run out of memory, so I can't really analyze the dump. I have 3GB of RAM in my machine, but it seems that's not enough for the 1.5GB dump.
My question is: do you know of a specific tool for heap dump analysis (supporting IBM JRE dumps) that I could run with the amount of memory I have?
Thanks.
Try the SAP memory analyzer tool (Eclipse MAT), which is also available as an Eclipse plugin. This tool creates index files on disk as it processes the dump file and requires much less memory than your other options. I'm pretty sure it supports the newer IBM JREs. That being said, with a 1.5GB dump file you might have no other option but to run a 64-bit JVM to analyze the file. I usually estimate that a heap dump file of size n takes 5*n memory to open using standard tools, and 3*n memory to open using MAT, but your mileage will vary depending on what the dump actually contains.
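By that estimate, a 1.5GB dump needs roughly 4.5GB of heap in MAT, which a 32-bit JVM on a 3GB machine cannot provide. If you do move to a 64-bit machine, you can raise the heap limit of MAT's own JVM in its `MemoryAnalyzer.ini`. A minimal sketch (the exact value, and the other lines already in the file, depend on your install and available RAM):

```ini
; MemoryAnalyzer.ini -- leave the existing -startup/launcher lines
; above as they are; everything after -vmargs goes to MAT's own JVM.
-vmargs
-Xmx4500m
```

Note that opening IBM JRE dumps (PHD/DTFJ format) in MAT additionally requires installing IBM's DTFJ adapter plugin.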
It's going to be difficult to analyze a 1.5GB heap dump on 3GB of RAM, because out of that 3GB your OS, other processes, services, etc. easily occupy 0.5GB, leaving you with only about 2.5GB. The HeapHero tool is efficient at analyzing heap dumps: it should need only about 0.5GB more than the size of the heap dump. You can give it a try. But the best recommendation is to analyze the heap dump on a machine that has adequate memory, or to get an AWS EC2 instance just for the period of analyzing the heap dumps. After analyzing the dumps, you can terminate the instance.
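If you go the remote-machine route, one workable pattern is to do the memory-heavy parsing step headlessly on the big machine with Eclipse MAT's batch script, then copy the much smaller index files and HTML report back. A sketch, assuming MAT is already installed on the instance and using a hypothetical dump file name (IBM-format dumps additionally need the DTFJ adapter installed into MAT):

```shell
# On the large-memory machine: parse the dump and generate the
# "leak suspects" report without opening the GUI.
# java_pid1234.hprof is a placeholder for your actual dump file.
cd /opt/mat
./ParseHeapDump.sh /tmp/java_pid1234.hprof org.eclipse.mat.api:suspects

# The parse step writes *.index files and a *_Leak_Suspects.zip
# report next to the dump; copy those back for local inspection.
```

Once the index files exist, reopening the dump in the MAT GUI is far cheaper than the initial parse, so a smaller local machine can often browse a dump that it could not have parsed itself.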