BufferedImage.getGraphics() causing memory leak, is there a fix?

Posted 2024-09-04 00:05:02


I'm having a problem with a framework API that calls the BufferedImage.getGraphics() method, causing a memory leak. All this method does is call BufferedImage.createGraphics(). On a Windows machine, createGraphics() is handled by Win32GraphicsEnvironment, which keeps a list of listeners inside its displayChanger field. When I call getGraphics on my BufferedImage someChart, someChart's SurfaceManager (which retains a reference to someChart) is added to the listeners map in Win32GraphicsEnvironment, preventing someChart from being garbage collected. Nothing afterwards removes someChart's SurfaceManager from the listeners map.

In general, once getGraphics has been called, the reference path that stops a BufferedImage from being garbage collected can be summarized as follows:

GC Root -> localGraphicsEnvironment(Win32GraphicsEnvironment)
-> displayChanger(SunDisplayChanger) -> listeners(Map) -> key(D3DCachingSurfaceManager) -> bImg(BufferedImage)
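
To make the symptom concrete, the following is a minimal sketch that should reproduce the pinning on an affected JDK build. The names in the comments are taken from the heap path above, not from any public API, so treat them as assumptions about this particular JDK:

    import java.awt.image.BufferedImage;

    public class GetGraphicsLeakRepro {
        public static void main(String[] args) {
            for (int i = 0; i < 10000; i++) {
                BufferedImage chart =
                        new BufferedImage(800, 600, BufferedImage.TYPE_INT_RGB);
                // Disposing the graphics context does not undo the
                // listener registration performed by createGraphics():
                chart.getGraphics().dispose();
                // 'chart' is unreachable from user code here, yet stays
                // pinned via localGraphicsEnvironment -> displayChanger
                // -> listeners on the affected JDKs, so heap use grows.
            }
        }
    }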

I could change the framework's code so that after every call to BufferedImage.getGraphics() I keep a reference to the BufferedImage's SurfaceManager. Then I would get hold of the localGraphicsEnvironment, cast it to Win32GraphicsEnvironment, and call removeDisplayChangedListener() with the reference to the BufferedImage's SurfaceManager. But I don't think this is a proper way to solve the problem.
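
For the record, that workaround would look roughly like the sketch below. SurfaceManager.getManager() and SunGraphicsEnvironment.removeDisplayChangedListener() are non-public sun.* internals, so both the names and their availability are assumptions that depend on the exact JDK build:

    import java.awt.Graphics2D;
    import java.awt.GraphicsEnvironment;
    import java.awt.image.BufferedImage;

    import sun.awt.DisplayChangedListener;
    import sun.awt.image.SurfaceManager;
    import sun.java2d.SunGraphicsEnvironment;

    public final class DisplayListenerWorkaround {

        // Render into the image, then deregister the listener that
        // createGraphics() registered behind the scenes.
        static void paintAndDeregister(BufferedImage image) {
            Graphics2D g = (Graphics2D) image.getGraphics();
            try {
                // ... the framework's rendering happens here ...
            } finally {
                g.dispose();
            }

            SurfaceManager manager = SurfaceManager.getManager(image);
            GraphicsEnvironment ge =
                    GraphicsEnvironment.getLocalGraphicsEnvironment();
            if (ge instanceof SunGraphicsEnvironment
                    && manager instanceof DisplayChangedListener) {
                ((SunGraphicsEnvironment) ge).removeDisplayChangedListener(
                        (DisplayChangedListener) manager);
            }
        }
    }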

Could someone please help me with this issue? Thanks a lot!


MORE DETAILS AND FINDINGS

The component I'm trying to add to my UI makes calls to BufferedImage.getGraphics() every time it is repainted. As a result, the amount of garbage retained by displayChanger (inside SunGraphicsEnvironment) should grow as the component gets repainted.

However, things are behaving strangely:

When I counted the actions on my UI that would surely trigger a repaint, then checked the number of garbage listeners inside displayChanger against my count, the numbers didn't match up. (E.g. there were 8 listeners before my clicks, and I made 60 clicks; afterwards there were only 18 listeners.)

On the other hand, if I set a breakpoint and step into the code that adds entries to displayListeners, every single click results in a new entry in displayListeners. And thus every BufferedImage held by displayListeners becomes garbage.

I considered the possibility that the SurfaceManager used as the key in displayListeners might be shared or reused, but my experiment ruled that out. I also considered caching, and I deliberately prevented caching by making every repaint call unique. Still, I have no clue how this could happen or how to solve the leak.
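
A reflection helper along the following lines can read the listener count directly, without the debugger. The field names displayChanger and listeners are the private internals seen in the heap dump above, so this is a debugging hack tied to this JDK, not an API:

    import java.awt.GraphicsEnvironment;
    import java.lang.reflect.Field;
    import java.util.Map;

    public final class DisplayListenerCounter {

        // Follows the private chain observed in the heap dump:
        // localGraphicsEnvironment -> displayChanger -> listeners(Map).
        static int countDisplayListeners() throws Exception {
            GraphicsEnvironment ge =
                    GraphicsEnvironment.getLocalGraphicsEnvironment();
            Object changer = readField(ge, "displayChanger");
            Map<?, ?> listeners = (Map<?, ?>) readField(changer, "listeners");
            return listeners.size();
        }

        // Looks up a declared field anywhere in the class hierarchy.
        private static Object readField(Object target, String name)
                throws Exception {
            for (Class<?> c = target.getClass(); c != null; c = c.getSuperclass()) {
                try {
                    Field f = c.getDeclaredField(name);
                    f.setAccessible(true);
                    return f.get(target);
                } catch (NoSuchFieldException notDeclaredHere) {
                    // keep walking up the superclasses
                }
            }
            throw new NoSuchFieldException(name);
        }
    }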


Comments (2)

寂寞陪衬 2024-09-11 00:05:02


After rendering the BufferedImage, you should invoke dispose() on the graphics context returned by createGraphics(). Here's an example and a list of similar methods.

Addendum: This seems like an object leak called packratting; the listener mismatch sounds like an artifact of using the debugger. You might get some ideas from the article Plugging memory leaks with soft references, by Brian Goetz.
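
A minimal sketch of the recommended pattern, standing in for the example link that did not survive in this copy: pair createGraphics() with dispose() in a finally block so the context is released even if drawing throws.

    import java.awt.Color;
    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;

    public class DisposeExample {
        public static void main(String[] args) {
            BufferedImage chart =
                    new BufferedImage(400, 300, BufferedImage.TYPE_INT_ARGB);
            Graphics2D g = chart.createGraphics();
            try {
                g.setColor(Color.BLUE);
                g.drawLine(0, 0, 399, 299);
            } finally {
                // Release the context's native resources as soon as
                // rendering is done.
                g.dispose();
            }
        }
    }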

翻了热茶 2024-09-11 00:05:02


Try to call flush() when you don't need your image any more.
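
A minimal usage sketch of that suggestion:

    import java.awt.image.BufferedImage;

    public class FlushExample {
        public static void main(String[] args) {
            BufferedImage chart =
                    new BufferedImage(400, 300, BufferedImage.TYPE_INT_RGB);
            // ... render and display the image ...
            // Release all reconstructable resources held for the image
            // (cached pixel data, accelerated surfaces) once done:
            chart.flush();
        }
    }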
