Does CG 3.0 leak?
I'm finding CG appears to have a memory leak. I submitted a report via nvidia.com, but if you try this here:
If you remove the line that says
cgD3D11SetTextureParameter( g.theTexture, g.sharedTex ) ;
The leak stops.
Does CG 3.0 really leak?
Using ATI Radeon 5850 GPU / Windows 7 64-bit.
评论(3)
Yes, it leaks. Internally it creates a ShaderResourceView on every call and never releases it. I think the API is ill-designed: they should have taken a ShaderResourceView* as the parameter to this function, instead of just a Resource*.
I posted about this on nvidia forums about 6 months ago and never got a response
Is your report posted publicly? Or some kind of private support ticket?
Yes, Cg 3.0 leaks every time you call cgD3D11SetTextureParameter(), causing your application's memory usage to climb. Unfortunately it makes Cg 3.0 with D3D11 completely unusable. One symptom is that after your application has been running for a while, it will stop rendering and the screen will just go black. I wasted a lot of time trying to determine the cause of this before discovering the Cg bug.
If anybody is wondering why this isn't apparent in the Cg D3D11 demos, it's because the few that actually use textures are so simple that they can get away with calling cgD3D11SetTextureParameter() only once at the start.
This same bug remains with Cg Toolkit 3.1 (April 2012).
jmp [UPDATE] ;; skip obsolete text segment
Could it be that Cg is being destroyed after D3D, so it doesn't release the reference in time? Or vice versa? For example, the function might acquire the texture but not release it before D3D shuts down, because when you set a texture on a shader, the texture is acquired until the shader resources are somehow released. You are destroying the D3D context here:
SAFE_RELEASE( g.d3d );
SAFE_RELEASE( g.gpu );
Later on, you free the shaders in CleanupCg():
cgDestroyProgram( g.v_vncShader );
checkForCgError( "destroying vertex program" );
cgDestroyProgram( g.px_vncShader );
checkForCgError( "destroying fragment program" );
Try changing the order of the calls so that you first release all resources from both Cg and D3D. This call:
cgD3D11SetDevice( g.cgContext, NULL );
should also be called before releasing the D3D context, just in case.

UPDATE:
The order should be different inside WinMain(): you should swap the calls to ensure Cg releases any D3D pointers it holds:
You could also provide the debugger output and other info, as I asked below. You're basically saying "Cg seems to be broken, here is the whole code, look at line ###, is it broken?", but there are more than a thousand lines (1012) of C, C++, and shader code in your file. You provide almost no information, yet readily point to a Cg bug (based on... what?). And if you're so sure the code is fine, why would anyone look at it? It isn't fine, by the way. Not that I dislike it, but it has little things such as the call ordering: silly mistakes, yet the kind that can make debugging a real hell. That ordering is a clear bug, and if I found a bug just by looking at Main, there is a long way up to the render call and the Cg implementation, isn't there? I can't run the app on WinXP, but these errors are in the most predictable places :)
So... when your code is clean of any bug... ohh! look! what I've just found..
it turns out that in the VertexBuffer constructor you call
iD3D->GetImmediateContext( &gpu );
and store the pointer in a private member, so... shouldn't you add a matching gpu->Release()?

OK, so there are some things in your code that cause memory leaks and that you should fix, and I found them just by taking a look, so you didn't really try. On the other hand, your code seems clear and full of explanations, and I need to learn some DX11, so actually I should thank you for it. The downvote was somewhat rude though :P, especially since I'm probably right, and other people will avoid reading your code as soon as this page displays.